A large, international company in the engineering sector is scaling production-grade AI across the organization. The Data & AI Platform team is expanding delivery capacity and is looking for an interim Data & AI Engineer who can step in immediately, take ownership of AI use cases end-to-end, and accelerate adoption across business teams on an Azure + Databricks foundation. This is a hands-on delivery role with high autonomy: you will build, ship, operate, and improve AI solutions in production—while introducing reusable components and pragmatic best practices that raise the bar across the platform.
Job Responsibilities:
Deliver AI use cases end-to-end: from ingestion and feature engineering to model/agent development and production rollout
Design and operate Databricks lakehouse pipelines (batch and streaming) using Spark/SQL/Delta Lake, including monitoring and data quality controls
Build AI solutions on the platform, including: RAG patterns (retrieval, chunking, embeddings, evaluation); tool-using agents and orchestration approaches; prompt strategies and testing/guardrails; and, where relevant, custom ML models and supporting pipelines
Productionize and run what you build: reliability, observability, cost control, and operational hygiene
Enable other teams by creating reusable components, templates, and delivery standards
Work with governance and compliance: align with AI governance requirements and ensure solutions are secure and auditable
Collaborate with stakeholders across IT and the business to translate needs into working solutions and clear delivery increments
Requirements:
Proven experience as a Data Engineer / Data & AI Engineer delivering solutions into production environments
Strong hands-on Databricks expertise: Spark/SQL, Delta Lake, Jobs/Workflows, performance tuning
Strong Python + SQL for data engineering and AI/ML workflows
Experience building data pipelines with quality checks and operational monitoring
Practical experience with LLM-based solutions (RAG and/or agents), including prompt strategies and evaluation approaches
Comfortable working independently in an interim context: you can own delivery, communicate clearly, and unblock yourself