At Optimove, the Data Engineer is a central role in the Tech Org. The Data Engineering (DE) team is a change-agent team that plays a significant part in Optimove's ongoing migration to cloud technologies, now at an advanced stage. The ideal candidate is a senior data engineer with a strong technical background in data infrastructure, data architecture design, and building robust data pipelines, along with the collaborative skills to work effectively with product managers, data scientists, onboarding engineers, and support staff.
Responsibilities:
Deploy and maintain critical data pipelines in production
Drive strategic technological initiatives and long-term plans, from initial exploration and POC through to going live in a fast-paced production environment
Design infrastructural data services, coordinating with the Architecture team, R&D teams, Data Scientists, and product managers to build scalable data solutions
Work in an Agile process with Product Managers and other tech teams
Take end-to-end responsibility for developing data-crunching and data-manipulation processes within the Optimove product
Design and implement data pipelines and data marts
Create data tools for various teams (e.g., onboarding teams) that assist them in building, testing, and optimizing the delivery of the Optimove product
Explore and implement new data technologies to support Optimove’s data infrastructure
Work closely with the core data science team to implement and maintain ML features and tools
Requirements:
B.Sc. in Computer Science or equivalent
7+ years of extensive SQL experience (preferably working in a production environment)
Experience with programming languages (preferably Python) – a must
Experience with "Big Data" environments, tools, and data modeling (preferably in a production environment)
Strong capability in schema design and data modeling
Understanding of micro-services architecture
Familiarity with Airflow, ETL tools, Snowflake, and MSSQL
Quick self-learner with strong problem-solving capabilities