At Makersite, we're pioneering the future of sustainable product development and digital collaboration. As a leading platform for product lifecycle management (PLM), we empower companies to make smarter, more sustainable decisions across their entire supply chain. Our software enables teams to design, prototype, and manufacture with transparency, efficiency, and responsibility, reducing environmental impact while optimizing performance. We're a fast-growing, innovative company that thrives on creativity, collaboration, and continuous learning. If you're passionate about technology, sustainability, and creating meaningful impact, we'd love to hear from you. Join us and be part of shaping the future of manufacturing and product innovation.
Job Responsibilities:
Design, build, and maintain scalable data pipelines and infrastructure to support analytics and machine learning workflows
Develop and optimize data storage solutions (e.g., data lakes, warehouses, databases) for performance, reliability, and scalability
Ensure data quality, consistency, and accessibility through robust validation, cleaning, and transformation processes
Collaborate with data scientists, analysts, and other stakeholders to understand data needs and translate them into technical solutions
Integrate data from multiple internal and external sources, ensuring security and compliance with data governance policies
Support machine learning initiatives by preparing, structuring, and serving data for model training, testing, and deployment
Monitor, troubleshoot, and improve data systems for performance, cost-efficiency, and reliability
Document processes, data flows, and best practices to ensure maintainability and knowledge sharing across the team
Stay up to date with emerging data engineering tools, technologies, and best practices to continuously improve our data infrastructure
Requirements:
Proven experience in AI/ML concepts and applying them in real-world data workflows
5+ years of full-time commercial experience with SQL and Python preferred; candidates with a minimum of 3 years will be considered
Hands-on experience with data orchestration tools such as Airflow, Metaflow, or Prefect
Strong knowledge of cloud platforms, with expertise in AWS (Azure is a plus)
Solid understanding of data compliance and security best practices, particularly when working with sensitive client data
Exposure to tracing and monitoring frameworks such as Grafana, Sentry, or OpenTelemetry
Experience handling both structured data (with defined schemas and rules) and unstructured data (requiring consolidation and organization before use)
Residing in and legally permitted to work in the EU