Data Engineer - Automation & AI

Riverflex

Location:
United Arab Emirates, Abu Dhabi or Dubai

Contract Type:
Not provided

Salary:

Not provided

Job Description:

Riverflex is partnering with a leading financial institution in the UAE on a strategic Proof of Concept (PoC) focused on increasing data engineering productivity through automation and AI. We are seeking a Data Engineer – Automation & AI with strong GenAI and agent-based experience to own this PoC end-to-end, applying AI- and agent-assisted techniques to the design, migration, and development of AWS-based data pipelines. This role is explicitly focused on accelerating data engineering ways of working with AI, not solely on building pipelines. You will act as the internal lead on how AI should be applied in day-to-day data engineering, working closely with the partner’s Data Lead and embedded with the on-site engineering team. A key part of the scope is translating legacy SQL and stored procedures into modern AWS Glue pipelines while defining practical AI patterns, tools, and guardrails that scale beyond the PoC.

Job Responsibility:

  • Data engineering & pipeline delivery: Design, build, and evolve AWS Glue–based data pipelines using Spark and SQL
  • Translate legacy SQL scripts and stored procedures into AWS Glue pipelines
  • Ensure migrated and newly built pipelines meet agreed standards for correctness, performance, and maintainability
  • AI-driven engineering acceleration: Apply Generative AI and agent-based techniques to accelerate data engineering tasks, including code generation/refactoring, pipeline development, and standardisation
  • Own the design and implementation of AI-assisted tooling that integrates directly into day-to-day engineering workflows
  • Codify successful patterns, reusable tools, and recommended ways of working for scaling beyond the PoC
  • AI tooling & experimentation: Work hands-on with Python and LLM APIs to build pragmatic, internal DE tools
  • Design effective prompts & interaction patterns for code generation & transformation
  • Evaluate and work with enterprise-grade AI platforms (e.g. AWS Bedrock, Azure AI Foundry) using GPT-4 / Claude-class models
  • Define practical rules of thumb and guardrails (e.g. where automation works, where it breaks down, where human intervention is required)
  • Collaboration & ways-of-working: Work closely with data and platform engineers to (dis)prove automation hypotheses and identify where AI adds real productivity gains vs. noise
  • Document outcomes and recommendations from the PoC and provide clear guidance on how AI should (and should not) be used in data engineering at scale
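The migration responsibilities above hinge on showing that a translated pipeline reproduces the legacy output. A minimal sketch of such a parity check, using an in-memory SQLite table and plain Python as stand-ins for the legacy warehouse and the migrated Glue/Spark job (the table, columns, and sample data are illustrative, not from the posting):

```python
import sqlite3

def run_legacy_sql(conn):
    # Stand-in for a legacy stored procedure: aggregate order totals per customer.
    return conn.execute(
        "SELECT customer, SUM(amount) AS total FROM orders "
        "GROUP BY customer ORDER BY customer"
    ).fetchall()

def run_migrated_pipeline(rows):
    # Stand-in for the migrated transformation (in practice a Glue/Spark job),
    # implemented here in plain Python so the comparison is self-contained.
    totals = {}
    for customer, amount in rows:
        totals[customer] = totals.get(customer, 0) + amount
    return sorted(totals.items())

# Illustrative sample data standing in for the source table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)])

legacy = run_legacy_sql(conn)
migrated = run_migrated_pipeline(
    conn.execute("SELECT customer, amount FROM orders").fetchall())

# Parity gate: the migrated pipeline must reproduce the legacy output exactly.
assert legacy == migrated, f"parity check failed: {legacy} != {migrated}"
print("parity check passed:", migrated)
```

In a real migration the same gate would run both versions against agreed sample datasets and diff the results, which gives the "agreed standards for correctness" bullet a concrete, automatable form.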

Requirements:

  • 6+ years of experience in data engineering or closely related engineering roles
  • Proven experience owning and shaping data engineering solutions, not only implementing individual pipelines
  • Strong hands-on experience with AWS-based data engineering, including: AWS Glue (jobs, transformations, orchestration), Spark (batch processing and transformations), Advanced SQL (complex logic, optimisation, performance tuning), End-to-end pipeline and workflow design
  • Solid (Python) engineering experience, including building reusable components and internal tooling
  • Demonstrated, practical experience applying Generative AI in engineering workflows, such as: Working with LLM APIs (e.g. AWS Bedrock, Azure AI Foundry, OpenAI), Prompt design for code generation, refactoring, and transformation, Understanding the limitations, failure modes, and risks of LLM-based automation
  • Experience designing AI-assisted engineering workflows or tools, for example: API-based services (e.g. FastAPI), MCP (or agent)-like orchestration patterns
  • Able to balance short-term PoC delivery with longer-term capability building
  • Experience in financial services or other regulated environments is a strong advantage
  • Ability to be based in the UAE for a minimum of 3 months, working full-time on-site (Abu Dhabi or Dubai)
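The requirements around LLM limitations and failure modes can be made concrete with a cheap guardrail: reject generated SQL that does not even parse before it reaches a pipeline. A sketch using SQLite's EXPLAIN as the syntax check (a real warehouse dialect such as Spark SQL would need its own parser; the queries and schema are illustrative):

```python
import sqlite3

def sql_syntax_gate(candidate_sql: str) -> bool:
    # Guardrail: return False for generated SQL that fails to parse.
    # A fresh in-memory database with the expected schema keeps the check cheap.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    try:
        # EXPLAIN compiles the statement without running the full query.
        conn.execute("EXPLAIN " + candidate_sql)
        return True
    except sqlite3.Error:
        return False

# Simulated model outputs: one valid query, one garbled hallucination.
good = "SELECT customer, SUM(amount) FROM orders GROUP BY customer"
bad = "SELEC customer FORM orders"

print(sql_syntax_gate(good))  # True
print(sql_syntax_gate(bad))   # False
```

A gate like this is one concrete instance of the "where automation works, where it breaks down" rules of thumb: syntactic validity is automatable, while semantic correctness still needs the parity-style checks and human review described elsewhere in the posting.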

Additional Information:

Job Posted:
January 10, 2026

Employment Type:
Full-time

Work Type:
On-site work

Similar Jobs for Data Engineer - Automation & AI

Data & AI Impact Consultant Engineer

Data Consultant role in Data & AI Business Unit, designing and building modern d...
Location:
Belgium
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiarity with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver value today while preparing for tomorrow
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative beyond projects to help build Inetum
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Training and certification programs
  • Full-time

Data & AI Impact Consultant Engineer

As a Data Consultant, you are a cornerstone of our Data & AI Business Unit – tec...
Location:
Belgium
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • 3+ years of experience in data engineering or analytics projects
  • Good knowledge of Dutch and English (French is an asset)
  • Familiar with Azure stack (Data Factory, Synapse, Storage, Purview, Functions) and/or Databricks
  • Structural and flexible thinking
  • Interest in AI and its role in modern data products (prompt engineering, GenAI, monitoring, automation)
  • Ability to create business impact and understand outcomes
  • Team-oriented mindset
Job Responsibility:
  • Design and build modern data platforms (Azure, Databricks, Data Fabric) with focus on reusability and AI-readiness
  • Deliver customer- and future-oriented value
  • Help colleagues grow through coaching, feedback, or knowledge sharing
  • Provide consultancy with solution-oriented approach
  • Take initiative in client development, talent growth, or community engagement
What we offer:
  • Company car
  • Fuel/charging card
  • Group insurance
  • Hospitalization coverage
  • 32 days of annual leave
  • Hybrid working options
  • Satellite offices
  • Continuous learning & development
  • Training and certification programs
  • Full-time

AI-First Automation Engineer

Instawork is on a mission to create meaningful economic opportunities for skille...
Location:
United States, San Francisco
Salary:
120000.00 - 150000.00 USD / Year
Instawork
Expiration Date:
Until further notice
Requirements:
  • 1+ years of experience in AI model development and proficient in testing, evaluating and setting model quality standards
  • 2+ years of experience in QA automation
  • BS or MS in Computer Science or related field
  • Programming experience in Python + JavaScript or similar languages
  • Thorough knowledge of automation testing frameworks, tools and release processes
  • Self-motivated, especially in learning new software tools and best practices in software quality assurance
  • Startup mentality: self-starter, multi-tasker, proactive and flexible
  • Good communication and interpersonal skills
  • You straddle a fine line between QA, software and DevOps engineering
Job Responsibility:
  • Develop customized AI models for QA data and conduct prompt engineering analysis
  • Implement and evolve QA processes to get effective test signals and scale testing efforts across multiple products
  • Build advanced automation architecture and integrate AI tools, thereby reducing manual intensive repetitive tasks
  • Transform our automation by integrating cutting-edge AI tools and methodologies, creating smarter, faster, and more resilient frameworks that elevate our efficiency and coverage standards
  • Help solve cross-platform testing challenges and contribute impactful ideas to improve quality
  • Build a self-optimizing QA system that scales with the growth of our products
  • Partner with engineering and infrastructure teams to leverage automation for scalable solutions to prevent regressions and ensure reliability of products
What we offer:
  • Variety of medical, dental, and vision plans with coverage beginning on the date of hire
  • Flexible paid time off
  • At least 8 paid company holidays annually
  • Phone stipend
  • Commuter stipend
  • Supplemental pay on qualified leaves
  • Employee health savings accounts (HSA) contribution
  • Flexible spending plans
  • 401K plan
  • Perkspot - discount program through Lumity
  • Full-time

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date:
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
  • Full-time

Data Engineer – AI Insights

We are looking for an experienced Data Engineer with AI Insights to design and d...
Location:
United States
Salary:
Not provided
Thirdeye Data
Expiration Date:
Until further notice
Requirements:
  • 5+ years of Data Engineering experience with exposure to AI/ML workflows
  • Advanced expertise in Python programming and SQL
  • Hands-on experience with Snowflake (data warehousing, schema design, performance tuning)
  • Experience building scalable ETL/ELT pipelines and integrating structured/unstructured data
  • Familiarity with LLM and RAG workflows, and how data supports these AI applications
  • Experience with reporting/visualization tools (Tableau)
  • Strong problem-solving, communication, and cross-functional collaboration skills
Job Responsibility:
  • Develop and optimize ETL/ELT pipelines using Python, SQL, and Snowflake to ensure high-quality data for analytics, AI, and LLM workflows
  • Build and manage Snowflake data models and warehouses, focusing on performance, scalability, and security
  • Collaborate with AI/ML teams to prepare datasets for model training, inference, and LLM/RAG-based solutions
  • Automate data workflows, validation, and monitoring for reliable AI/ML execution
  • Support RAG pipelines and LLM data integration, enabling AI-driven insights and knowledge retrieval
  • Partner with business and analytics teams to transform raw data into actionable AI-powered insights
  • Contribute to dashboarding and reporting using Tableau, Power BI, or equivalent tools
  • Full-time

Analyst – Automation, AI and Data Analysis

This is a role for someone with strong technical and analytical skills that is l...
Location:
Australia, Melbourne
Salary:
Not provided
Frontier Advisors
Expiration Date:
Until further notice
Requirements:
  • Finance, Commerce, Science, or Engineering undergraduate degree
  • Problem solving skills and a keen attention to detail are critical
  • Strong Excel competency, including experience with formulas and pivot tables
  • Experience in roles requiring data administrative functions highly regarded
  • Tech savvy with the ability to streamline processes and systems to optimise efficiency outcomes
  • Skills to evaluate AI outputs for accuracy, bias, and appropriateness while understanding AI limitations and potential ethical concerns
  • Working knowledge of Python and SQL
  • Comfortable using tools such as GitHub Copilot to speed up development, with sound judgement to review and validate generated code
Job Responsibility:
  • Be at the forefront of our drive for innovation in operational efficiency by building lightweight AI and automation solutions using Python and SQL (e.g. data validation, templated reporting, summarisation)
  • Write clear, testable AI prompts and templates tailored to Frontier’s style and use-cases
  • Fact-check outputs within AI tools and against source data
  • Data administration: Assisting with tasks associated with ongoing data governance including clean up, mapping, management and analysis as required
  • Platform administration: Engaging in support of internal and external platforms and processes to provide guidance and assistance to users
  • Assisting with data mapping and maintenance to ensure the data feeding into Frontier Advisors' proprietary systems is accurate, timely, and available within our agreed schedule
  • Identifying improvements to current data processes, and where appropriate, assisting with the implementation of new processes
  • Provide support to the Research Team, including active participation in the manager research process (manager review and rating) and the preparation of due diligence documents such as annual reviews, MAPS, and other documents

AI Research Engineer, Data Infrastructure

As a Research Engineer in Infrastructure, you will design and implement a robust...
Location:
United States, Palo Alto
Salary:
180000.00 - 250000.00 USD / Year
1X Technologies
Expiration Date:
Until further notice
Requirements:
  • Strong experience in building data pipelines and ETL systems
  • Ability to design and implement systems for data collection and management from robotic fleets
  • Familiarity with architectures that span on-robot components, on-premise clusters, and cloud infrastructure
  • Experience with data labeling tools or building dataset visualization and annotation tooling
  • Proficiency in creating or applying machine learning models for dataset organization and automated labeling
Job Responsibility:
  • Optimize operational efficiency of data collection across the NEO robot fleet
  • Design intelligent triggers to determine when and what data should be uploaded from the robots
  • Automate ETL pipelines to make fleet-wide data easily queryable and training-ready
  • Collaborate with external dataset providers to prepare diverse multi-modal pre-training datasets
  • Build frontend tools for visualizing and automating the labeling of large datasets
  • Develop machine learning models for automatic dataset labeling and organization
What we offer:
  • Equity
  • Health, dental, and vision insurance
  • 401(k) with company match
  • Paid time off and holidays
  • Full-time

Senior AI Data Engineer

We are looking for a Senior AI Data Engineer to join an exciting project for our...
Location:
Poland, Warsaw
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Degree in Computer Science, Data Science, Artificial Intelligence, or a related field
  • Several years of experience in AI and Machine Learning development, preferably in Customer Care solutions
  • Strong proficiency in Python and NLP frameworks
  • Hands-on experience with Azure AI services (e.g., Azure Machine Learning, Cognitive Services, Bot Services)
  • Solid understanding of cloud architectures and microservices on Azure
  • Experience with CI/CD pipelines and MLOps
  • Excellent leadership and communication skills
  • Analytical mindset with strong problem-solving abilities
  • Polish and English at a minimum B2 level.
Job Responsibility:
  • Lead the development and implementation of AI-powered features for a Customer Care platform
  • Design and deploy Machine Learning and NLP models to automate customer inquiries
  • Collaborate with DevOps and cloud architects to ensure a high-performance, scalable, and secure Azure-based architecture
  • Optimize AI models to enhance customer experience
  • Integrate Conversational AI, chatbots, and language models into the platform
  • Evaluate emerging technologies and best practices in Artificial Intelligence
  • Mentor and guide a team of AI/ML developers.
What we offer:
  • Flexible working hours
  • Hybrid work model, allowing employees to divide their time between home and modern offices in key Polish cities
  • A cafeteria system that allows employees to personalize benefits by choosing from a variety of options
  • Generous referral bonuses, offering up to PLN 6,000 for referring specialists
  • Additional revenue sharing opportunities for initiating partnerships with new clients
  • Ongoing guidance from a dedicated Team Manager for each employee
  • Tailored technical mentoring from an assigned technical leader, depending on individual expertise and project needs
  • Dedicated team-building budget for online and on-site team events
  • Opportunities to participate in charitable initiatives and local sports programs
  • A supportive and inclusive work culture with an emphasis on diversity and mutual respect.
  • Full-time