Our client partners with high-growth startups to build operational systems that transform how teams work. They focus on tooling, workflows and data infrastructure, treating operations as a product: designing systems that move data, automate work and scale reliably. In this role, you'll join the data and engineering effort that underpins those operational systems, contributing to ETL pipelines, data orchestration and TypeScript development.
Job Responsibilities:
Design, build and maintain ETL/ELT pipelines to collect, transform and load data from multiple sources into a central data layer (data warehouse, data lake or equivalent)
Work end-to-end: from data ingestion and schema design, through transformation/cleaning, to enabling downstream analytics, dashboards and workflows
Write robust, well-tested code in TypeScript (and possibly other languages/tools) for data and pipeline orchestration; a rough sketch of this kind of work follows this list
Partner with cross-functional stakeholders (ops, analytics, engineering) to understand data needs, define requirements and ensure data quality, reliability and availability
Monitor data pipelines, set up alerting, ensure data accuracy, performance and scalability as the systems grow
Collaborate in building the infrastructure for data as a source of truth, enabling the broader internal tooling and operational systems the company delivers
Contribute to design, architecture and best practices around data workflows, including performance, versioning, observability, documentation and automation
Operate in a remote, high-autonomy context, often with startup clients and varied tooling environments
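To give a concrete, purely illustrative flavour of the pipeline work described above, here is a minimal TypeScript ETL sketch. This is not the client's actual stack: the source endpoint, the record shape and the loadToWarehouse helper are hypothetical stand-ins for whatever source API and warehouse client a real project would use.

```typescript
// Minimal ETL sketch: pull JSON records from an API, normalise them,
// and hand them to a (hypothetical) warehouse loader. Illustrative only.

interface RawOrder {
  id: string;
  amount_cents: string; // upstream sends numbers as strings
  created_at: string;
}

interface Order {
  id: string;
  amountCents: number;
  createdAt: Date;
}

// Extract: fetch a page of records from a hypothetical source API.
async function extract(url: string): Promise<RawOrder[]> {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`extract failed: ${res.status}`);
  return (await res.json()) as RawOrder[];
}

// Transform: coerce types and drop records that fail basic validation.
function transform(rows: RawOrder[]): Order[] {
  return rows.flatMap((r) => {
    const amountCents = Number(r.amount_cents);
    const createdAt = new Date(r.created_at);
    if (!r.id || Number.isNaN(amountCents) || Number.isNaN(createdAt.getTime())) {
      console.warn(`skipping malformed record: ${JSON.stringify(r)}`);
      return [];
    }
    return [{ id: r.id, amountCents, createdAt }];
  });
}

// Load: stand-in for a warehouse client (Snowflake, BigQuery, Redshift, ...).
async function loadToWarehouse(table: string, rows: Order[]): Promise<void> {
  console.log(`would load ${rows.length} rows into ${table}`);
}

async function run() {
  const raw = await extract("https://api.example.com/orders"); // hypothetical endpoint
  const clean = transform(raw);
  await loadToWarehouse("analytics.orders", clean);
}

run().catch((err) => {
  console.error("pipeline failed", err);
  process.exit(1);
});
```

The shape (typed extract, validating transform, loader behind a narrow interface) is the point of the sketch; in a real pipeline each stage would also carry retries, pagination and metrics.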
Requirements:
Proven experience building ETL/ELT pipelines in a production environment (designing ingestion + transformation + loading + monitoring)
Strong proficiency in TypeScript (experience writing backend/data-oriented TS code)
Solid understanding of data engineering concepts: data modelling, schema design, warehousing, data quality, scheduling, monitoring, error handling (see the data-quality sketch after this list)
Familiarity with modern data tooling/stacks (for example: cloud data warehouses such as Snowflake, BigQuery or Redshift; orchestration tools; ETL frameworks; job scheduling; APIs/webhooks)
Excellent problem solving: comfortable working with messy data, identifying bottlenecks, and designing scalable, maintainable solutions
Comfortable working remotely, collaborating asynchronously and autonomously across time zones
Based in Europe (remote, but you must reside in, and be time-zone compatible with, Europe)
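As a flavour of the data-quality and monitoring discipline the role calls for, here is a minimal sketch of post-load checks with a simple alert hook. The check thresholds and the alert transport are invented for illustration; a real setup would push to Slack, PagerDuty or a similar sink.

```typescript
// Sketch of post-load data-quality checks with a simple alert hook.
// Thresholds and the alert transport are illustrative assumptions.

interface QualityCheck {
  name: string;
  passed: boolean;
  detail: string;
}

function checkRowCount(actual: number, expectedMin: number): QualityCheck {
  return {
    name: "row_count",
    passed: actual >= expectedMin,
    detail: `got ${actual}, expected at least ${expectedMin}`,
  };
}

function checkNullRate(nulls: number, total: number, maxRate = 0.01): QualityCheck {
  const rate = total === 0 ? 1 : nulls / total;
  return {
    name: "null_rate",
    passed: rate <= maxRate,
    detail: `null rate ${(rate * 100).toFixed(2)}% (max ${(maxRate * 100).toFixed(2)}%)`,
  };
}

// Stand-in for a real alert sink (Slack webhook, PagerDuty, etc.).
async function alert(message: string): Promise<void> {
  console.error(`ALERT: ${message}`);
}

async function runChecks(checks: QualityCheck[]): Promise<void> {
  const failures = checks.filter((c) => !c.passed);
  for (const f of failures) {
    await alert(`check "${f.name}" failed: ${f.detail}`);
  }
  if (failures.length === 0) console.log("all data-quality checks passed");
}

// Example run with made-up load statistics: the row-count check fails
// (9,500 < 10,000) and fires an alert, while the null-rate check passes.
runChecks([checkRowCount(9_500, 10_000), checkNullRate(12, 9_500)]);
```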
Nice to have:
Experience working in a startup or dynamic environment where rapid iteration and high autonomy are expected
Familiarity with internal-tooling or operations automation stacks (e.g., connecting CRMs, Slack, billing systems, data warehouses)
Experience with CI/CD for TypeScript projects, and strong engineering discipline around tests, code review and observability
Experience with data visualisation, dashboards, or analytics (so data flows serve downstream users)
Experience with orchestration tools (e.g., Airflow, dbt, Prefect, Dagster) or cloud job frameworks
Experience integrating disparate data sources, APIs/webhooks, and real-time or near-real-time data movement (a sketch of this pattern follows below)
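For the webhook/near-real-time item above, here is one common pattern sketched in TypeScript using only Node's built-in http module: a receiver that buffers incoming events and flushes them downstream in small batches. The port, batch size, flush interval and downstream sink are all assumptions for illustration.

```typescript
// Sketch of near-real-time ingestion: a webhook receiver that buffers
// incoming events and flushes them downstream in small batches.
// Uses only Node's built-in http module; the downstream sink is a stub.

import { createServer } from "node:http";

const buffer: unknown[] = [];
const BATCH_SIZE = 50;           // illustrative batch threshold
const FLUSH_INTERVAL_MS = 5_000; // flush at least this often

// Stand-in for a queue or warehouse write (Kafka, Pub/Sub, etc.).
async function flush(): Promise<void> {
  if (buffer.length === 0) return;
  const batch = buffer.splice(0, buffer.length);
  console.log(`flushing ${batch.length} events downstream`);
}

const server = createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/webhook") {
    res.writeHead(404).end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => { body += chunk; });
  req.on("end", () => {
    try {
      buffer.push(JSON.parse(body));
      if (buffer.length >= BATCH_SIZE) void flush(); // size-triggered flush
      res.writeHead(202).end();
    } catch {
      res.writeHead(400).end("invalid JSON");
    }
  });
});

setInterval(() => void flush(), FLUSH_INTERVAL_MS); // time-triggered flush
server.listen(8080, () => console.log("webhook receiver listening on :8080"));
```

Batching on both size and time keeps downstream writes cheap while bounding end-to-end latency, which is the usual trade-off in near-real-time data movement.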
What we offer:
Flexible remote work from Europe — you choose your location and working hours (within reason for team overlap)
Opportunity to work across high-impact projects with fast-growing startups and help build systems that scale
High autonomy and opportunity for ownership: you’ll be shaping how the data layer supports both internal tooling and external client operations
Exposure to modern tooling, workflows and operational systems built at scale