At Marktplaats, data is at the heart of everything we do, but intelligence is what differentiates us. You will join the Data Platform team, the engineering backbone of the Data & Analytics crew and the foundation that powers innovation across our entire Product & Technology landscape. We are a group of passionate engineers who do more than maintain pipelines: we are architecting a high-scale environment where backend services meet analytical insights. You will step into a collaborative culture that values autonomy and solving complex distributed-systems problems, and you will act as a pivotal player in our evolution.
Job Responsibilities:
Lead the evolution of our Data Platform and architect the "Data Exchange" strategy
Define robust patterns for API-based ingestion, event-driven architectures (Kafka), and reverse ETL
Ensure architectures are optimized for cost and performance on AWS
Act as a catalyst for technical evolution
Constantly scan the horizon for next-generation technologies
Lead the implementation of new paradigms
Design the strategy for Unity Catalog implementation and Data Contracts
Champion FinOps, automating cost controls for our highest-volume workloads
Build the underlying infrastructure that allows Analytics/DWH teams to run efficient transformations
Elevate the technical bar of the team, mentoring Staff and Senior engineers
Translate complex architectural trade-offs into clear strategies for Product Managers and Backend Leads
Requirements:
10+ years of hands-on experience in Software Development or Data Engineering
At least 5 years specifically focused on building Data Platforms
Deep understanding of how platform infrastructure supports Analytics workloads
Proven experience evolving complex platforms from legacy patterns to modern, cloud-native solutions
Deep knowledge of Spark internals, JVM tuning, and performance optimization for high-scale batch and streaming datasets
Deep expertise in Unity Catalog, Delta Lake internals, and optimizing high-volume workloads
Strict software engineering discipline (CI/CD, testing, OOP) applied to data pipelines
Understanding of microservices architecture
Understanding of the needs of Analytics/DWH teams (data modeling, dbt)
Strong background in building automated pipelines using Terraform/Terragrunt and ensuring system observability
Ability to bridge the gap between "Application Data" and "Analytical Data"
Expert-level Python
Strong proficiency in Scala for backend integration
What we offer:
An attractive Base Salary
Participation in our Short Term Incentive plan (annual bonus)
Work From Anywhere: up to 20 days a year of working from any location
A 24/7 Employee Assistance Program for you and your family
A collaborative environment with an opportunity to explore your potential and grow