Unilever is looking for a CD Excellence Associate Data Engineer to help take our B2B sales team to the next level through the implementation of AI-based selling capabilities that leverage industry-leading datasets and insights. By growing our structured and unstructured data and unlocking actionable insights at scale through AI, you will help bring our Go-to-Market vision to life.
Job Responsibilities:
Design, build, and maintain scalable data pipelines and solutions on Microsoft Azure or similar cloud platforms
Develop and optimize ETL/ELT workflows to support high-volume, high-velocity data ingestion
Implement robust data models and structures that support analytics, reporting, and machine learning workloads
Integrate new data sources—internal and external—into the enterprise data ecosystem to expand data availability and unlock new business insights
Partner with product, engineering, business and global teams to identify opportunities for new datasets and ensure seamless onboarding
Establish scalable frameworks for data discovery, cataloging, and lineage to support enterprise-wide data growth
Automate data workflows, quality checks, and monitoring using cloud-native tools and Databricks capabilities
Collaborate with data scientists to operationalize AI models using Databricks, Azure Machine Learning, or similar platforms
Ensure data readiness, reliability, and accessibility to accelerate AI adoption and experimentation
Contribute to the development of an enterprise AI strategy by identifying data gaps, opportunities, and scalable patterns
Work closely with cross-functional teams to translate business requirements into scalable data solutions
Provide technical guidance and best practices on Azure and Databricks to engineering and analytics teams
Participate in code reviews, architecture discussions, and continuous improvement initiatives
Assist in identifying and defining new system functionality within the Go-to-Market technology stack
Improve processes and workflows within the Sales organization related to Sales applications and system support
Provide user support: troubleshoot and identify problems, work with Local and Global IT to resolve technical issues, and provide users with proper training
Assist in the analysis of underlying system issues arising from investigations into requirements and problems, and identify available solutions for consideration
Manage multiple deliverables in a fast-paced and ever-changing business environment
Expand and improve an industry-leading dataset
Accelerate business value creation by unlocking AI capabilities
Make recommendations on the systems, processes, and services that will help the go-to-market organization achieve our business objectives
Requirements:
Technical aptitude and the ability to drive business value through focused technology solutions
Deep hands-on experience with cloud services such as Azure data services (e.g., Data Factory, Databricks, ADLS, Synapse, Azure SQL)
Strong proficiency in building scalable ETL/ELT pipelines using Databricks (APIs, PySpark, Spark SQL, Delta Lake)
Solid understanding of distributed computing, data lakehouse architecture, Unity Catalog and modern data engineering patterns
Ability to design and optimize data models that support analytics, reporting, and machine learning workloads
Strong SQL and Python skills, with the ability to write clean, efficient, production-ready code
Proven ability to onboard new data sources, integrate APIs, and work with structured and semi-structured data
Experience designing frameworks for data ingestion, metadata management, and data lineage
Comfort working with large-scale datasets and evolving data ecosystems
Experience automating data workflows, quality checks, and monitoring using Azure-native tools
Familiarity with CI/CD practices for data engineering (e.g., GitHub Actions, Azure DevOps)
Ability to build resilient, self-healing pipelines that minimize manual intervention
Strong focus on performance tuning, cost optimization, and operational reliability
Ability to build and maintain feature pipelines that support ML and AI initiatives
Experience collaborating with data scientists to operationalize models in Databricks or Azure ML
Understanding of how data quality, structure, and availability impact AI outcomes
Curiosity and initiative to identify new data opportunities that unlock AI use cases
Strong ability to turn data into insights and communicate actionable business narratives
Ability to quickly adapt to new AI technologies around LLMs, MCPs, prompt engineering, and advanced data modeling
Interest in developing a deep understanding of the Foodservice industry and its go-to-market use cases
Ability to lead and execute multiple projects simultaneously
Strong business partnering and communication skills
What we offer:
Bonus eligible
Long-Term Incentive (LTI) eligible
Eligible to participate in a benefits plan which may include health insurance (including prescription drug, dental, and vision coverage), retirement savings benefits, life insurance and disability benefits, parental leave, sick leave, paid vacation and holidays, as well as access to numerous voluntary benefits