
Team Lead - Data Engineer


Clouds on Mars Sp. z o. o.


Location:
Warsaw, Poland


Contract Type:
B2B


Salary:

Not provided

Job Description:

Looking for stable projects with modern tech and room to grow? We are Clouds on Mars, an IT consulting company specializing in architecting and delivering solutions that exceed current technological and domain standards. From cloud and data engineering through AI-driven analytics to web apps and automation – we manage the full project cycle. For those who share our mindset, we offer not just projects, but a full-time, long-term partnership.

Job Responsibilities:

  • Design and deliver scalable ETL/ELT pipelines, data models, and reporting solutions using technologies such as Azure Data Factory, Synapse, Fabric, Databricks, and Power BI
  • Architect modern data platforms (data warehouse, data lake, or lakehouse) with performance, scalability, and reliability in mind
  • Collaborate with technical stakeholders to translate business requirements into robust, production-ready data solutions
  • Contribute to CI/CD processes, code quality standards, automated testing, and documentation
  • Act as a senior technical contributor across one or more client projects in a consulting setting
  • Act as direct people manager for a team of data engineers, responsible for performance reviews, development planning, and career coaching
  • Conduct regular 1:1s, provide feedback, and support skills growth through mentoring and learning opportunities
  • Balance and allocate team members across multiple projects, ensuring workload clarity and capacity planning
  • Foster a healthy, collaborative team culture by promoting open communication, continuous improvement, and psychological safety
  • Monitor team morale and resolve delivery blockers that affect individual or team performance
  • Work closely with Project Managers and Technical Leads to align team priorities and remove conflicts or inefficiencies
  • Ensure consistency in delivery standards, team engagement, and cross-project coordination
  • Support team members in navigating overlapping priorities or challenges across multiple engagements

Requirements:

  • 5+ years of experience delivering data engineering solutions in a consulting or multi-client project environment
  • Strong technical expertise with the Microsoft Azure ecosystem (Data Factory, Synapse, Fabric), Databricks, and Power BI
  • Solid background in ETL/ELT design, data integration, data modeling, and performance optimization
  • Proven people management experience, including performance evaluation, coaching, and career development
  • Excellent communication skills and ability to engage with both technical and business stakeholders
  • Fluency in English and Polish

Nice to have:

  • Experience in AI/ML, or a strong interest in developing skills in this area

What we offer:
  • Flexible working hours
  • Private health insurance (UNUM up to PLN 350,000, advanced up to PLN 500,000)
  • Co-funding for PZU, Medicover or LuxMed medical packages
  • Multisport card
  • Access to training platforms (MS certificates, SQL BI, internal Delivery Excellence Workshops)
  • Breakfasts, fruit, snacks, and always-chilled drinks in the office
  • Anniversary gifts and quarterly integration budgets
  • Paid referral program (up to PLN 8,000)
  • “It’s good to stay with us” – PLN 2,500 voucher after 3 years with the company

Additional Information:

Job Posted:
February 04, 2026

Employment Type:
Full-time

Work Type:
Remote work


Similar Jobs for Team Lead - Data Engineer

Lead Data Engineer

Citi Fund Services is undergoing a major transformation effort to transform t...
Location: Belfast, United Kingdom
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Requirements:
  • Significant years of hands-on experience in software development, with proven experience in data integration / data pipeline developments
  • Exceptional technical leader with a proven background in delivery of significant projects
  • Multi-year experience in data integration development (Ab Initio, Talend, Apache Spark, AWS Glue, SSIS, or equivalent), including optimization, tuning, and benchmarking
  • Multi-year experience in SQL databases (Oracle, MSSQL, and equivalents), including optimization, tuning, and benchmarking
  • Expertise with Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) a significant plus
  • Strong understanding of Agile methodologies (Scrum, Kanban) and experience working in Agile teams
  • Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premises or public cloud (e.g., Tekton, Harness, Jenkins)
  • Demonstrable expertise in financial services considered a plus
  • Self-starter with the ability to drive projects independently and deliver results in a fast-paced environment
Job Responsibilities:
  • Architect and develop enterprise-scale data pipelines using the latest data streaming technologies
  • Implement and optimize delivered solutions, tuning for optimal performance through frequent benchmarking
  • Develop containerised solutions capable of running in private or public cloud
  • Ensure solutions align with CI/CD tooling and standards
  • Ensure solutions align with observability standards
  • Effectively communicate technical solutions and artifacts to non-technical stakeholders and senior leadership
  • Contribute to the modernization of existing data processors and their move to common and cloud platforms
  • Collaborate with cross-functional domain experts to translate business requirements into scalable data solutions
What we offer:
  • 27 days annual leave (plus bank holidays)
  • A discretionary annual performance-related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources

Employment Type: Full-time

Data Engineering Lead

The Data Engineering Lead is a strategic professional who stays abreast of developments...
Location: Pune, India
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.) as well as Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibilities:
  • Strategic Leadership: Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Team Management: Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Architecture and Design: Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Technology Selection and Implementation: Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Performance Optimization: Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness, ensuring optimal access to global wealth data
  • Collaboration: Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions that support investment strategies and client reporting
  • Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations, particularly around sensitive financial data

Employment Type: Full-time

Big Data / Scala / Python Engineering Lead

The Applications Development Technology Lead Analyst is a senior level position ...
Location: Chennai, India
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Requirements:
  • At least two years of experience building and leading highly complex, technical data engineering teams (10+ years of hands-on data engineering experience overall)
  • Lead data engineering team, from sourcing to closing
  • Drive strategic vision for the team and product
  • Experience managing a data-focused product or ML platform
  • Hands-on experience designing, developing, and optimizing scalable distributed data processing pipelines using Apache Spark and Scala
  • Experience managing, hiring and coaching software engineering teams
  • Experience with large-scale distributed web services and the processes around testing, monitoring, and SLAs to ensure high product quality
  • 7 to 10+ years of hands-on experience in big data development, focusing on Apache Spark, Scala, and distributed systems
  • Proficiency in Functional Programming: High proficiency in Scala-based functional programming for developing robust and efficient data processing pipelines
  • Proficiency in Big Data Technologies: Strong experience with Apache Spark, Hadoop ecosystem tools such as Hive, HDFS, and YARN
Job Responsibilities:
  • Partner with multiple management teams to ensure appropriate integration of functions to meet goals as well as identify and define necessary system enhancements to deploy new products and process improvements
  • Resolve a variety of high-impact problems/projects through in-depth evaluation of complex business processes, system processes, and industry standards
  • Provide expertise in area and advanced knowledge of applications programming and ensure application design adheres to the overall architecture blueprint
  • Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
  • Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
  • Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary

Employment Type: Full-time

Lead Data Engineer

Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company. A leader i...
Location: Gurugram, India
Salary: Not provided
Company: Circle K
Expiration Date: Until further notice

Requirements:
  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field
  • 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
  • Solid grasp of data governance, metadata tagging, and role-based access control
  • Proven ability to mentor and grow engineers in a matrixed or global environment
  • Strong verbal and written communication skills, with the ability to operate cross-functionally
  • Certifications in Azure, Databricks, or Snowflake are a plus
  • Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
  • Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools
Job Responsibilities:
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
  • Lead the technical execution of non-domain specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
  • Architect data models and re-usable layers consumed by multiple downstream pods
  • Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
  • Mentor and coach the team
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines
  • Own engineering best practices, sprint planning, and quality across the Enablement pod
  • Contribute to platform discussions and architectural decisions across regions

Employment Type: Full-time

Lead Data Engineer

Lead Data Engineer to serve as both a technical leader and people coach for our ...
Location: Gurugram, India
Salary: Not provided
Company: Circle K
Expiration Date: Until further notice

Requirements:
  • Bachelor’s or master’s degree in Computer Science, Engineering, or a related field
  • 8-10 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark
  • Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse
  • Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices
  • Solid grasp of data governance, metadata tagging, and role-based access control
  • Proven ability to mentor and grow engineers in a matrixed or global environment
  • Strong verbal and written communication skills, with the ability to operate cross-functionally
  • Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
  • Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools
  • Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
Job Responsibilities:
  • Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms
  • Lead the technical execution of non-domain specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines)
  • Architect data models and re-usable layers consumed by multiple downstream pods
  • Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks
  • Mentor and coach the team
  • Partner with product and platform leaders to ensure engineering consistency and delivery excellence
  • Act as an L3 escalation point for operational data issues impacting foundational pipelines
  • Own engineering best practices, sprint planning, and quality across the Enablement pod
  • Contribute to platform discussions and architectural decisions across regions

Employment Type: Full-time

Data Engineering Lead

The Engineering Lead Analyst is a senior level position responsible for leading ...
Location: Singapore
Salary: Not provided
Company: Citi
Expiration Date: Until further notice

Requirements:
  • 10-15 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix Scripting and other Big data frameworks
  • 4+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase
  • Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.) as well as Scala and SQL
  • Data Integration, Migration & Large Scale ETL experience (Common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation and normalization
  • Data Modeling experience (OLAP, OLTP, Logical/Physical Modeling, Normalization, knowledge on performance tuning)
  • Experienced in working with large and multiple datasets and data warehouses
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and datasets
  • Strong analytic skills and experience working with unstructured datasets
  • Ability to effectively use complex analytical, interpretive, and problem-solving techniques
  • Experience with Confluent Kafka, Redhat JBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira
Job Responsibilities:
  • Define and execute the data engineering roadmap for Global Wealth Data, aligning with overall business objectives and technology strategy
  • Lead, mentor, and develop a high-performing, globally distributed team of data engineers, fostering a culture of collaboration, innovation, and continuous improvement
  • Oversee the design and implementation of robust and scalable data pipelines, data warehouses, and data lakes, ensuring data quality, integrity, and availability for global wealth data
  • Evaluate and select appropriate technologies and tools for data engineering, staying abreast of industry best practices and emerging trends specific to wealth management data
  • Continuously monitor and optimize data pipelines and infrastructure for performance, scalability, and cost-effectiveness
  • Partner with business stakeholders, data scientists, portfolio managers, and other technology teams to understand data needs and deliver effective solutions
  • Implement and enforce data governance policies and procedures to ensure data quality, security, and compliance with relevant regulations
What we offer:
  • Equal opportunity employer commitment
  • Accessibility and accommodation support
  • Global workforce benefits

Employment Type: Full-time

Software Engineer Team lead - KYC & Eligibility Team

Blockchain.com is connecting the world to the future of finance. As the most tru...
Location: Paris, France
Salary: Not provided
Company: Blockchain
Expiration Date: Until further notice

Requirements:
  • Experience leading teams in identity verification, fraud prevention, or KYC-related engineering
  • Strong software engineering and system architecture knowledge, particularly in onboarding and verification workflows
  • Experience with JVM languages (Kotlin, Java)
  • Familiarity with KYC and identity verification providers (e.g., Sumsub, Veriff, DXC)
  • Knowledge of security, compliance, and fraud detection best practices in onboarding
  • Experience with monitoring, observability, and debugging onboarding-related issues
  • A passion for crypto and the transformations it enables
  • Proven experience mentoring and developing engineers
  • Strong stakeholder management skills, with experience working alongside Compliance, Risk, and Product teams
  • Ability to collaborate with third-party providers
Job Responsibilities:
  • Ensure the reliability and scalability of the KYC & Eligibility platform, optimizing identity verification processes
  • Uphold code quality standards, conduct code reviews, and ensure best practices for security and compliance
  • Collaborate with identity verification providers (Sumsub, Veriff, DXC, etc.), integrating their solutions effectively into the onboarding flow
  • Monitor onboarding performance metrics, improving verification success rates and reducing friction in the user journey
  • Guide the team in debugging and incident resolution, ensuring swift resolution of KYC-related technical issues
  • Implement and promote observability best practices
  • Mentor and develop engineers
  • Build a culture of collaboration and transparency
  • Provide constructive feedback
  • Encourage ownership and accountability
What we offer:
  • Full-time salary based on experience and meaningful equity in an industry-leading company
  • Work from Anywhere Policy: You can work remotely from anywhere in the world for up to 20 days per year
  • ClassPass
  • Budgets for learning & professional development
  • Unlimited vacation policy
  • Apple equipment
  • The opportunity to be a key player and build your career at a rapidly expanding, global technology company in an emerging field
  • Flexible work culture

Employment Type: Full-time

Team Lead Data Engineer

Data Management Platform is the core system that receives, processes and provide...
Salary: Not provided
Company: Coherent Solutions
Expiration Date: Until further notice

Requirements:
  • Desire and readiness to perform a team lead role and a tech lead role
  • 5+ years of experience in Java
  • Strong knowledge of algorithms and data structures
  • Readiness to deep dive into legacy codebase
  • Experience with SQL DBs
  • Solid experience with Kafka, streaming systems, microservices
  • Experience dealing with performance-sensitive, high-scale systems
  • Understanding of Hadoop/Spark/big data tools
  • Analytical thinking, with the ability to deeply investigate tasks and understand how system components work from the business side
  • Reliability, confidence and readiness to deal with production issues
Job Responsibilities:
  • Perform the team lead / people management role for 3 of our engineers: 1:1s, ensuring high motivation and retention, working with feedback, mentoring, and tech support
  • Perform the tech lead role for a mixed team of roughly 5 customer and Coherent engineers: coordination, task distribution, technical assistance
  • End-to-end development and ownership, from design to production
  • Implement high scale Big-Data solutions and contribute to our platform infrastructure and architecture
  • Research core technologies and integrations with external APIs and services
  • Work with various stakeholders: Product, Engineering, data providers, etc.
  • Participate in an off-hours PagerDuty on-call rotation
What we offer:
  • Technical and non-technical training for professional and personal growth
  • Internal conferences and meetups to learn from industry experts
  • Support and mentorship from an experienced employee to help your professional growth and development
  • Internal startup incubator
  • Health insurance
  • English courses
  • Sports activities to promote a healthy lifestyle
  • Flexible work options, including remote and hybrid opportunities
  • Referral program for bringing in new talent
  • Work anniversary program and additional vacation days