Linnify is a global technology partner that helps visionary companies accelerate their digital development initiatives. With expertise in custom software and AI development, we’ve delivered over 80 digital products across Europe and North America.

We’re looking for a Lead Data Engineer to take ownership of the data architecture and infrastructure across key product initiatives. You’ll drive the evolution of our data systems, lead complex pipeline designs, and mentor engineers to deliver scalable, high-performance, and cost-efficient solutions. This is a hands-on role that blends architectural leadership with execution. The role is hybrid, based in Cluj-Napoca, offering the best of both flexibility and in-person collaboration with product-minded teams.
Job Responsibility:
Design and evolve end-to-end data architectures, including ingestion, processing, storage, and analytics
Lead the development of reliable, scalable, and cost-effective data pipelines for both batch and streaming workloads
Optimize ETL/ELT workflows across multiple domains and data sources
Own and maintain orchestration systems and ensure observability, lineage, and reliability standards are met
Collaborate cross-functionally with product managers, analysts, and engineers to translate data needs into architectural plans
Ensure compliance with data governance, security, and privacy best practices
Mentor and support data engineers, review code, and contribute to the team’s technical culture
Stay ahead of trends in data tooling, cloud infrastructure, and engineering best practices
Define and track success metrics for data quality, pipeline performance, and system cost-effectiveness
Requirements:
At least 5 years of experience in data engineering, with strong exposure to large-scale production systems
Proven track record designing and deploying robust, cloud-native data architectures
Expert-level proficiency in Python and SQL, including best practices for testing and modularity
Solid experience with orchestration tools such as Airflow, Dagster, or Prefect
Deep familiarity with modern data platforms such as Databricks, Snowflake, BigQuery, or Palantir
In-depth knowledge of at least one major cloud provider (AWS, GCP, or Azure)
Strong understanding of streaming systems like Kafka, Pulsar, or Kinesis
Proven ability to mentor engineers and lead technical initiatives
Excellent communication skills and the ability to translate complex requirements into elegant systems
Strategic thinking with hands-on implementation skills
Highly autonomous with a problem-solving mindset
Strong documentation and system-design communication
Collaborative, pragmatic, and business-aligned decision-making
Fluent in English
Nice to have:
DevOps & Observability: Terraform, GitHub Actions, dbt, Datadog, Great Expectations
What we offer:
Flexible work schedule and remote work days
ESOP (Employee Stock Ownership Plan) so that you grow with us
Additional loyalty vacation days (up to 28 total vacation days based on experience and time with us)