AI Data Engineer


Hewlett Packard Enterprise

Location:
United States, San Juan

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The AI Data Engineer role involves designing and implementing cloud platforms for big data exploration, deploying data science solutions into cloud environments, and collaborating with teams on cloud technologies and workflows. The position focuses on cost-saving strategies, CI/CD pipelines, orchestration, and containerization.

Job Responsibility:

  • Research, propose, design, implement, operate, and maintain cloud platforms for big data exploration and visualization, in support of a team of data scientists
  • Deploy data science solutions into cloud environments
  • Work with data scientists to troubleshoot cloud workflows
  • Collaborate closely with our data lake team on cloud technologies
  • Identify and implement cost-saving strategies to reduce ongoing cloud expenses
  • Build CI/CD pipelines
  • Deploy and maintain orchestration and monitoring systems for big data processing
  • Help build images and containerize applications

Requirements:

  • Bachelor’s degree in computer science, engineering, information systems, or a closely related quantitative discipline
  • 4-7 years’ experience
  • Strong programming skills in Python, Java, Golang, or JavaScript
  • Good understanding of distributed systems, event-driven programming paradigms, and designing for scale and performance
  • Experience with cloud-native applications, developer tools, managed services, and next-generation databases
  • Knowledge of DevOps practices such as CI/CD, infrastructure as code, containerization, and orchestration using Kubernetes
  • Good written and verbal communication skills
  • Comfortable with AWS services
  • Familiarity with the landscape of big data exploration, visualization, and prototyping platforms
  • Familiarity with statistical and machine learning techniques
  • Familiarity with wireless and wired networking protocols

Nice to have:

  • Cloud Architectures
  • Cross Domain Knowledge
  • Design Thinking
  • Development Fundamentals
  • DevOps
  • Distributed Computing
  • Microservices Fluency
  • Full Stack Development
  • Release Management
  • Security-First Mindset
  • User Experience (UX)

What we offer:
  • Comprehensive suite of benefits that supports physical, financial, and emotional wellbeing
  • Specific programs catered to professional development
  • Inclusive working environment

Additional Information:

Job Posted:
July 31, 2025

Employment Type:
Full-time

Work Type:
On-site work

Similar Jobs for AI Data Engineer

Senior Data & AI Innovation Engineer

We are seeking a highly proactive, self-driven Senior Data & AI Engineer to serv...
Location:
Singapore
Salary:
7000.00 - 8000.00 SGD / Month
Randstad
Expiration Date:
January 08, 2026
Requirements:
  • Proven, hands-on experience in implementing and supporting practical AI use cases (beyond academic study), understanding how to embed AI components into existing services
  • 4+ years of hands-on experience in implementing and operating Snowflake Data Cloud in a production environment
  • Certification (e.g., SnowPro Data Engineer) is highly desirable
  • Familiarity with MLOps concepts and tools (e.g., Docker, MLflow, LangChain) and an understanding of LLMs, RAG pipelines, and generative AI deployment
  • Strong programming skills in Python for data manipulation, scripting, and AI model support
Job Responsibility:
  • Proactively identify, design, and implement initial AI Proof-of-Concepts (POCs) across the APAC region, focusing on quick-win solutions like AI-powered chatbots and intelligent inventory monitoring systems
  • Analyze business processes to identify areas where AI components can be effectively embedded to solve immediate business challenges
  • Partner with business stakeholders to understand AI data needs, perform data engineering/prep, and ensure data readiness to support and sustain deployed AI models
  • Stay ahead of technology trends, perform proactive research on Data and AI solutions, and evangelize new capabilities to regional teams
  • Act as the APAC SME, collaborating closely with cross-regional peers and global teams to contribute to and align with the company Global Data Platform roadmap (Snowflake)
  • Define and execute the complete migration strategy from legacy data warehouses/databases (e.g., PostgreSQL, MS SQL) to the Snowflake Data Cloud platform
  • Design, build, and optimize scalable, robust ETL/ELT data pipelines to curate raw data into BI and Advanced Analytics datasets
  • Implement and manage Snowflake governance, including access control, data security, usage monitoring, and performance optimization aligned with global best practices

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date:
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
Employment Type: Full-time

Quality Engineer - AI and Data Platforms

This is a pioneering Quality Engineer role at the intersection of data engineeri...
Location:
United Kingdom, Manchester
Salary:
44000.00 - 66000.00 GBP / Year
Matillion
Expiration Date:
Until further notice
Requirements:
  • Solid foundation in data engineering: including SQL, ETL/ELT design, and specific experience building data pipelines and managing data movement
  • Strong practical AI experience: you have used, experimented with, and are an advocate for an AI-first approach to quality engineering
  • Proficiency in coding in Java or JavaScript to navigate the codebase and implement quality frameworks
  • Demonstrated Autonomy, Curiosity, and Problem-solving skills, with a willingness to look at challenges in a different way and ask for assistance as needed
  • Experience in managing end-to-end testing of SaaS applications, including developing and maintaining efficient test automation tooling
Job Responsibility:
  • Leveraging AI and agentic solutions, including our agentic AI product Maia, to accelerate investigation, generate test cases, and increase quality assurance across the Data Productivity Cloud
  • Performing root cause analysis on pipeline stability issues, particularly identifying why DPC pipelines run out of memory (OOM) within the agents
  • Building pipelines to automate every process, solutionizing problems to increase overall team and product productivity
  • Acting as a crucial bridge by collaborating extensively with various teams, raising problems, and ensuring that fixes are implemented effectively
  • Adopting, implementing, and championing shift-left testing practices across the team, leading an automation-first approach
What we offer:
  • Company Equity
  • 30 days holiday + bank holidays
  • 5 days paid volunteering leave
  • Health insurance
  • Life Insurance
  • Pension
  • Access to mental health support
Employment Type: Full-time

Data Engineer with Generative AI Expertise

We are looking for a skilled Data Engineer with expertise in Generative AI to jo...
Location:
India, Jaipur
Salary:
Not provided
InfoObjects
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related fields
  • 2-6 years of hands-on experience in Data Engineering
  • Proficiency in Generative AI frameworks (e.g., GPT, DALL-E, Stable Diffusion)
  • Strong programming skills in Python, SQL, and familiarity with Java or Scala
  • Experience with data tools and platforms such as Apache Spark, Hadoop, or similar
  • Knowledge of cloud platforms like AWS, Azure, or GCP
  • Familiarity with MLOps practices and AI model deployment
  • Excellent problem-solving and communication skills
Job Responsibility:
  • Design, develop, and maintain robust data pipelines and workflows
  • Integrate Generative AI models into existing data systems to enhance functionality
  • Collaborate with cross-functional teams to understand business needs and translate them into scalable data and AI solutions
  • Optimize data storage, processing, and retrieval systems for performance and scalability
  • Ensure data security, quality, and governance across all processes
  • Stay updated with the latest advancements in Generative AI and data engineering practices

Principal Consulting AI / Data Engineer

As a Principal Consulting AI / Data Engineer, you will design, build, and optimi...
Location:
Australia, Sydney
Salary:
Not provided
DyFlex Solutions
Expiration Date:
Until further notice
Requirements:
  • Proven expertise in delivering enterprise-grade data engineering and AI solutions in production environments
  • Strong proficiency in Python and SQL, plus experience with Spark, Airflow, dbt, Kafka, or Flink
  • Experience with cloud platforms (AWS, Azure, or GCP) and Databricks
  • Ability to confidently communicate and present at C-suite level, simplifying technical concepts into business impact
  • Track record of engaging senior executives and influencing strategic decisions
  • Strong consulting and stakeholder management skills with client-facing experience
  • Background in MLOps, ML pipelines, or AI solution delivery highly regarded
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibility:
  • Design, build, and maintain scalable data and AI solutions using Databricks, cloud platforms, and modern frameworks
  • Lead solution architecture discussions with clients, ensuring alignment of technical delivery with business strategy
  • Present to and influence executive-level stakeholders, including boards, C-suite, and senior directors
  • Translate highly technical solutions into clear business value propositions for non-technical audiences
  • Mentor and guide teams of engineers and consultants to deliver high-quality solutions
  • Champion best practices across data engineering, MLOps, and cloud delivery
  • Build DyFlex’s reputation as a trusted partner in Data & AI through thought leadership and client advocacy
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with opportunities to lead large-scale client programs
  • Exposure to diverse industries and client environments, including executive-level engagement
Employment Type: Full-time

Consulting AI / Data Engineer

As a Consulting AI / Data Engineer, you will design, build, and optimise enterpr...
Location:
Australia, Sydney
Salary:
Not provided
DyFlex Solutions
Expiration Date:
Until further notice
Requirements:
  • Hands-on data engineering experience in production environments
  • Strong proficiency in Python and SQL
  • Experience with at least one additional language (e.g. Java, Typescript/Javascript)
  • Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
  • Background in building ML pipelines, MLOps practices, or feature stores is highly valued
  • Proven expertise in relational databases, data modelling, and query optimisation
  • Demonstrated ability to solve complex technical problems independently
  • Excellent communication skills with ability to engage clients and stakeholders
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibility:
  • Build and maintain scalable data pipelines for ingesting, transforming, and delivering data
  • Manage and optimise databases, warehouses, and cloud storage solutions
  • Implement data quality frameworks and testing processes to ensure reliable systems
  • Design and deliver cloud-based solutions (AWS, Azure, or GCP)
  • Take technical ownership of project components and lead small development teams
  • Engage directly with clients, translating business requirements into technical solutions
  • Champion best practices including version control, CI/CD, and infrastructure as code
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, salary packaging, remote working, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with mentoring from senior engineers
  • Exposure to diverse industries and client environments
Employment Type: Full-time

Data & AI Impact Consultant Engineer

As a Data Consultant, you are a cornerstone of our Data & AI Business Unit – tec...
Location:
Belgium
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiar with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver customer- and future-oriented value
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative in client development, talent growth, or community engagement
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Continuous learning & development
  • Training and certification programs
Employment Type: Full-time

Data & AI Impact Consultant Engineer

Data Consultant role in Data & AI Business Unit, designing and building modern d...
Location:
Belgium
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiarity with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver value today while preparing for tomorrow
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative beyond projects to help build Inetum
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Training and certification programs
Employment Type: Full-time