We are looking for a Consultant – Data Analytics with strong SQL skills and hands‑on experience in the Azure data stack. You will design, implement, and manage data ingestion, transformation, and curation pipelines across Azure Databricks and Azure Data Factory, ensuring reliable, well‑governed data flows that power analytics and reporting, and that data is accurate, secure, and available on time for downstream use. This role is ideal for someone who enjoys problem‑solving, clean design, and collaboration.
Job Responsibilities:
Pipeline Development: Build and maintain scalable ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks (PySpark/SQL)
Data Engineering: Implement data models (Data Vault, star and snowflake), design Delta Lake tables, apply partitioning strategies, and ensure data quality through validations and reconciliation
SQL Development and Optimization: Write efficient SQL queries, optimize performance through query tuning and index management, and maintain stored procedures
Orchestration & Monitoring: Configure triggers, implement error handling and retries, set up logging and alerting mechanisms, and support production jobs by resolving issues to ensure robust data pipeline operations
Access Management: Manage access controls and permissions across Azure data services, including Databricks, Data Lake, and Data Factory
Best Practices: Follow secure development practices (Key Vault, managed identities) and apply RBAC, resource tagging, and cost optimization
Collaboration: Work with BI developers, business analysts, and product teams to translate requirements into technical solutions
Documentation & Communication: Create and maintain detailed runbooks, architecture diagrams, and documentation
present updates and findings to stakeholders
Requirements:
Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field
3–5 years of hands‑on experience in Data Analytics and Data Engineering
Ability to simplify complex technical issues for non‑technical audiences
Ownership, attention to detail, and a problem‑solving mindset
Strong time‑management capabilities and ability to meet deadlines
Ability to collaborate effectively in a team environment
Adaptability and willingness to learn new technologies
Nice to have:
ISO Standards Awareness: Familiarity with ISO 14001 and 45001 standards
What we offer:
A chance to join a highly ambitious and growth‑oriented team as an Individual Contributor, with opportunities to take ownership of product groups based on performance
Cross‑functional collaboration and potential for international exposure or short‑term travel
A strong environment for professional growth, continuous learning, and global collaboration
A flexible and inclusive workplace that supports work‑life balance
Competitive salary aligned with our value of Doing the Right Thing