
Senior Data Engineer


Provectus

Location:

Category:
IT - Software Development

Contract Type:
B2B

Salary:
Not provided

Job Description:

Provectus helps companies adopt ML/AI to transform the ways they operate, compete, and drive value. The company focuses on building ML infrastructure to drive end-to-end AI transformations, helping businesses adopt the right AI use cases and scale their AI initiatives organization-wide in industries such as Healthcare & Life Sciences, Retail & CPG, Media & Entertainment, Manufacturing, and Internet businesses. We are seeking a talented and experienced Data Engineer to join our team at Provectus. As part of our diverse practices, including Data, Machine Learning, DevOps, Application Development, and QA, you will collaborate with a multidisciplinary team of data engineers, machine learning engineers, and application developers. You will encounter numerous technical challenges, contribute to internal solutions, and engage in R&D activities, all of which provide an excellent environment for professional growth.

Job Responsibility:

  • Collaborate closely with clients to deeply understand their existing IT environments, applications, business requirements, and digital transformation goals
  • Collect and manage large volumes of varied data sets
  • Work directly with ML Engineers to create robust and resilient data pipelines that feed Data Products
  • Define data models that integrate disparate data across the organization
  • Design, implement, and maintain ETL/ELT data pipelines
  • Perform data transformations using tools such as Spark, Trino, and AWS Athena to handle large volumes of data efficiently
  • Develop, continuously test, and deploy Data API Products with Python and frameworks like Flask or FastAPI
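
Several of the responsibilities above (ETL/ELT pipelines, large-volume transformations) share the same extract-transform-load shape. A minimal sketch using only the Python standard library, with an in-memory SQLite table standing in for a warehouse; the table name and records are illustrative, not from the posting:

```python
import sqlite3

def extract():
    # Toy extract step; a real pipeline would read from APIs, files, or streams.
    return [{"user": "a", "amount": "10.5"}, {"user": "b", "amount": "4.0"}]

def transform(rows):
    # Cast types and drop rows with a missing amount.
    return [(r["user"], float(r["amount"])) for r in rows if r.get("amount")]

def load(rows, conn):
    # Load into a warehouse stand-in (in-memory SQLite here).
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments (user, amount) VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 14.5
```

In production the same three stages are typically split across an orchestrator (Airflow, Dagster), an engine (Spark, Trino, Athena), and a warehouse, but the data flow is the same.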

Requirements:

  • 5+ years of experience in data engineering
  • Experience in AWS
  • Experience handling real-time and batch data flow and data warehousing with tools and technologies like Airflow, Dagster, Kafka, Apache Druid, Spark, dbt, etc.
  • Proficiency in programming languages relevant to data engineering, such as Python and SQL
  • Proficiency with Infrastructure as Code (IaC) technologies like Terraform or AWS CloudFormation
  • Experience in building scalable APIs
  • Familiarity with Data Governance aspects like Quality, Discovery, Lineage, Security, Business Glossary, Modeling, Master Data, and Cost Optimization
  • Upper-Intermediate or higher English skills
  • Ability to take ownership, solve problems proactively, and collaborate effectively in dynamic settings

Nice to have:

  • Experience with Cloud Data Platforms (e.g., Snowflake, Databricks)
  • Experience in building Generative AI Applications (e.g., chatbots, RAG systems)
  • Relevant AWS, GCP, Azure, Databricks certifications
  • Knowledge of BI Tools (Power BI, QuickSight, Looker, Tableau, etc.)
  • Experience in building Data Solutions in a Data Mesh architecture

What we offer:
  • Long-term B2B collaboration
  • Paid vacations and sick leaves
  • Public holidays
  • Compensation for medical insurance or sports coverage
  • External and Internal educational opportunities and AWS certifications
  • A collaborative local team and international project exposure

Additional Information:

Job Posted:
December 11, 2025


Similar Jobs for Senior Data Engineer


Senior Data Platform Engineer

We are looking for an experienced data engineer to join our platform engineering...
Location:
United States
Salary:
141000.00 - 225600.00 USD / Year
Axon
Expiration Date:
Until further notice
Requirements
  • 5+ years of experience in data engineering, software engineering with a data focus, data science, or a related role
  • Knowledge of designing data pipelines from a variety of sources (e.g. streaming, flat files, APIs)
  • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL)
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink, Pulsar, Redpanda)
  • Strong programming skills in common data-focused languages (e.g., Python, Scala)
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Prefect, Temporal)
  • Familiarity with AWS-based data solutions
  • Strong understanding of data warehousing concepts and technologies (Snowflake)
  • Experience documenting data dependency maps and data lineage
  • Strong communication and collaboration skills
Job Responsibility
  • Design, implement, and maintain scalable data pipelines and infrastructure
  • Collaborate with software engineers, product managers, customer success managers, and others across the business to understand data requirements
  • Optimize and manage our data storage solutions
  • Ensure data quality, reliability, and security across the data lifecycle
  • Develop and maintain ETL processes and frameworks
  • Work with stakeholders to define data availability SLAs
  • Create and manage data models to support business intelligence and analytics
What we offer
  • Competitive salary and 401k with employer match
  • Discretionary time off
  • Paid parental leave for all
  • Medical, Dental, Vision plans
  • Fitness Programs
  • Emotional & Development Programs
  • Snacks in our offices

Senior ML Data Engineer

As a Senior Data Engineer, you will play a pivotal role in our AI/ML workstream,...
Location:
Poland, Warsaw
Salary:
Not provided
Awin Global
Expiration Date:
Until further notice
Requirements
  • Bachelor's or Master's degree in Data Science, Data Engineering, or Computer Science with a focus on math and statistics; a Master's degree is preferred
  • At least 5 years of experience as an AI/ML data engineer undertaking the above tasks and accountabilities
  • Strong foundation in computer science principles and statistical methods
  • Strong experience with cloud technology (AWS or Azure)
  • Strong experience with creating data ingestion pipelines and ETL processes
  • Strong knowledge of big data tools such as Spark, Databricks, and Python
  • Strong understanding of common machine learning techniques and frameworks (e.g. mlflow)
  • Strong knowledge of Natural Language Processing (NLP) concepts
  • Strong knowledge of Scrum practices and an agile mindset
  • Strong analytical and problem-solving skills with attention to data quality and accuracy
Job Responsibility
  • Design and maintain scalable data pipelines and storage systems for both agentic and traditional ML workloads
  • Productionise LLM- and agent-based workflows, ensuring reliability, observability, and performance
  • Build and maintain feature stores, vector/embedding stores, and core data assets for ML
  • Develop and manage end-to-end traditional ML pipelines: data prep, training, validation, deployment, and monitoring
  • Implement data quality checks, drift detection, and automated retraining processes
  • Optimise cost, latency, and performance across all AI/ML infrastructure
  • Collaborate with data scientists and engineers to deliver production-ready ML and AI systems
  • Ensure AI/ML systems meet governance, security, and compliance requirements
  • Mentor teams and drive innovation across both agentic and classical ML engineering practices
  • Participate in team meetings and contribute to project planning and strategy discussions
What we offer
  • Flexi-Week and Work-Life Balance: We prioritise your mental health and well-being, offering you a flexible four-day Flexi-Week at full pay and with no reduction to your annual holiday allowance. We also offer a variety of different paid special leaves as well as volunteer days
  • Remote Working Allowance: You will receive a monthly allowance to cover part of your running costs. In addition, we will support you in setting up your remote workspace appropriately
  • Pension: Awin offers access to an additional pension insurance to all employees in Germany
  • Flexi-Office: We offer an international culture and flexibility through our Flexi-Office and hybrid/remote work possibilities to work across Awin regions
  • Development: We’ve built our extensive training suite Awin Academy to cover a wide range of skills that nurture you professionally and personally, with trainings conveniently packaged together to support your overall development
  • Appreciation: Thank and reward colleagues by sending them a voucher through our peer-to-peer program

Senior Data Engineer

At ANS, the Senior Data Engineer plays a key role in delivering robust, scalable...
Location:
United Kingdom, Manchester
Salary:
Not provided
ANS Group
Expiration Date:
Until further notice
Requirements
  • Experience in building and optimising pipelines in Azure Data Factory, Synapse, or Fabric
  • Strong knowledge of Python and SQL
  • Experience in using metadata frameworks in data engineering
  • Experience in best practice data engineering principles including CI/CD via Azure DevOps or GitHub
  • Understanding of Azure networking and security in relation to the data platform
  • Experience of data governance and regulation, including GDPR, principle of least privilege, classification etc.
  • Experience of lakehouse architecture, data warehousing principles, and data modelling
  • Familiarity with Microsoft Purview in a data platform context
  • Basic knowledge of Azure AI Foundry
Job Responsibility
  • Build and optimise data pipelines, notebooks, and data flows in Microsoft Fabric and Synapse Analytics, connecting to a variety of on-premises and cloud based data sources
  • Support Data Architects and Cloud Engineers by implementing solutions based on provided designs and offering feedback where needed
  • Collaborate across disciplines to ensure high-quality delivery of data solutions, including working with presales, managed services, and customer teams
  • Mentor Data engineers and support their development through guidance and task distribution
  • Ensure best practice adherence in engineering processes, including CI/CD via Azure DevOps and secure data handling (e.g. Key Vault, private endpoints)
  • Contribute to Agile delivery by participating in standups, user story creation, and sprint planning
  • Document implemented solutions clearly and accurately for internal and customer use
  • Troubleshoot and resolve issues across subscriptions and environments
  • Work closely with the Project Manager (where applicable) to align on delivery timelines, report progress, and manage risks, while also acting as a key point of contact for customer SMEs and engineers to support collaboration and clarify technical requirements
  • Engage in continuous learning through certifications (e.g. DP-600 and/or DP-700, AI-900, AI-102, etc.) and development days
What we offer
  • 25 days’ holiday, plus you can buy up to 5 more days
  • Birthday off
  • An extra celebration day
  • 5 days’ additional holiday in the year you get married
  • 5 volunteer days
  • Private health insurance
  • Pension contribution match and 4 x life assurance
  • Flexible working and work from anywhere for up to 30 days per year (some exceptions)
  • Maternity: 16 weeks’ full pay, Paternity: 3 weeks’ full pay, Adoption: 16 weeks’ full pay
  • Company social events
  • Fulltime

Senior Data Engineer II

We are looking for a skilled Data Engineer to join our growing team. You will pl...
Location:
India, Hyderabad
Salary:
Not provided
Alter Domus
Expiration Date:
Until further notice
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 3+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
  • Technical Skills: Cloud & Orchestration: Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
  • Familiarity with best practices for Airflow DAG structure, dependency management, and error handling
  • AWS Expertise: Hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM)
  • Snowflake: Proficient in setting up data warehouses, configuring security, and optimizing queries on Snowflake
  • Data Ingestion & Transformation: Experience with Airbyte or similar tools for data ingestion
  • dbt or other SQL-based transformation frameworks for modular data processing
  • Programming: Proficiency in Python and/or Java/Scala for building data pipelines and custom integrations
  • Query Languages: Advanced knowledge of SQL for data manipulation and analysis
Job Responsibility
  • Data Pipeline Orchestration: Design, build, and maintain end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows
  • Implement DAGs (Directed Acyclic Graphs) with retry logic, error handling, and alerting to ensure data quality and pipeline reliability
  • Data Ingestion & Transformation: Integrate data from various sources using Airbyte for ingestion and dbt for transformations in a scalable and modular fashion
  • Collaborate with Data Analysts and Data Scientists to implement transformations and business logic, ensuring data is analytics-ready
  • Data Modeling & Warehousing: Design and implement efficient data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse)
  • Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases
  • Data Governance & Security: Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance
  • Work closely with IT security to establish robust encryption standards, audit trails, and identity/role-based access
  • Performance Optimization: Optimize AWS Athena queries and configurations (e.g., data partitioning) for performance and cost efficiency
  • Monitor and tune Airflow DAGs, Snowflake queries, and data transformations to improve throughput and reliability
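
The DAG retry and error-handling responsibility above can be illustrated without Airflow itself. A hedged sketch of the underlying pattern; the task, error, and backoff values are hypothetical, and real Airflow DAGs configure this declaratively via `retries` and `retry_delay` rather than in application code:

```python
import time

def run_with_retries(task, retries=3, base_delay=0.01):
    # Retry a callable with exponential backoff, re-raising once retries
    # are exhausted so the failure can surface for alerting.
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff

calls = {"n": 0}

def flaky_load():
    # Hypothetical task that fails twice with a transient error, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient warehouse error")
    return "loaded"

result = run_with_retries(flaky_load)
print(result)  # loaded
```

The same idea, with retry state tracked by the scheduler instead of in-process, is what makes orchestrated pipelines resilient to transient source or warehouse errors.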
What we offer
  • Support for professional accreditations
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday
  • Continuous mentoring along your career progression
  • Active sports, events and social committees across our offices
  • 24/7 support available from our Employee Assistance Program
  • The opportunity to invest in our growth and success through our Employee Share Plan
  • Plus additional local benefits depending on your location

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location:
Finland, Helsinki
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Extensive Occupational Health Care, Dental Care, as well as sports, culture, massage and lunch benefits
  • Regular office breakfast

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location:
Ireland, Cork
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements
  • Proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL
  • Familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Plankton program recognizes extra work to the open source ecosystem for developers and non-developers alike
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Private medical & dental health insurance
  • Childbirth cash benefit
  • Fulltime

Senior Data Engineer

We’re hiring a Senior Data Engineer to build and own critical components of our ...
Location:
Germany, Berlin
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects that you find meaningful outside of work - and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Professional massage at the office
  • Health and fitness benefits through Urban Sport Club membership
  • Fulltime

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2) plus years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, and Vertex AI
  • Six (6) plus years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
What we offer
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Fulltime
Welcome to CrawlJobs.com
Your Global Job Discovery Platform
At CrawlJobs.com, we simplify finding your next career opportunity by bringing job listings directly to you from all corners of the web. Using cutting-edge AI and web-crawling technologies, we gather and curate job offers from various sources across the globe, ensuring you have access to the most up-to-date job listings in one place.