
Senior Data Engineer


Blue Margin


Location:
United States, Fort Collins

Category:
IT - Software Development


Contract Type:
Not provided


Salary:
110,000 - 140,000 USD / Year

Job Description:

At Blue Margin, we are on a mission to build the go-to data platform for PE-backed mid-market companies. We are a dynamic, customer-focused company providing hosted data platforms across many industries, helping mid-market clients turn their data into a strategic asset. Our clients rely on us to design and deliver reporting platforms that fuel better, faster decision-making, and we are passionate about helping them increase company value through better analysis. We are looking for a Senior Data Engineer to strengthen our team.

As a Senior Data Engineer, you will lead the design, optimization, and scaling of the data platforms that power analytics for our clients. You will be hands-on with data pipelines, large-scale data processing, and modern cloud data stacks while mentoring team members and helping shape best practices. The role requires strong expertise in Python (PySpark/Apache Spark), deep experience working with high-volume data, and a track record of optimizing Delta Lake-based architectures. Exposure to Snowflake or Microsoft Fabric, and to tools like Fivetran, Azure Data Factory, and Synapse Pipelines, is highly valued. If you are motivated by solving complex data challenges, thrive in a collaborative environment, and enjoy applying AI to increase engineering productivity, this role offers the opportunity to make a significant technical and strategic impact.

Job Responsibility:

  • Architect, design, and optimize large-scale data pipelines using PySpark, Spark SQL, Delta Lake, and other cloud-native tooling
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning (see the sketch after this list)
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality
  • Oversee and mentor junior data engineers, establishing coding standards and best practices
  • Ensure high standards for data quality, security, and governance
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes
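
For illustration, here is a minimal PySpark sketch of the incremental Delta loading and partition-level tuning mentioned above. The table names, columns, and watermark value are hypothetical placeholders, not Blue Margin's actual platform; it assumes Spark with the delta-spark package available.

```python
# Minimal sketch of an incremental (delta) load into a partitioned Delta table.
# Table and column names (bronze_orders, silver.orders, order_id, updated_at,
# order_date) are hypothetical placeholders, not an actual client schema.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("incremental-orders-load").getOrCreate()

# Read only records changed since the last successful run. The watermark would
# normally come from a control table; it is hard-coded here for brevity.
last_watermark = "2025-12-01T00:00:00"
changes = (
    spark.read.table("bronze_orders")
    .filter(F.col("updated_at") > F.lit(last_watermark))
)

# MERGE the changes into the target table instead of rewriting it.
target = DeltaTable.forName(spark, "silver.orders")
(
    target.alias("t")
    .merge(changes.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Periodic compaction keeps small files under control on hot partitions.
# Assumes the table is partitioned by order_date and a Delta runtime that
# supports OPTIMIZE.
spark.sql("OPTIMIZE silver.orders WHERE order_date >= date_sub(current_date(), 7)")
```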

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability
  • Strong SQL skills and understanding of relational and distributed data systems
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines
  • Experience leveraging AI-assisted tools to accelerate engineering workflows
  • Strong communication skills: the ability to convey complex technical details to both engineers and business stakeholders

Nice to have:

Relevant certifications (Azure, Snowflake, or Fabric)

What we offer:
  • Competitive pay
  • Strong benefits
  • Flexible hybrid work setup

Additional Information:

Job Posted:
December 06, 2025

Employment Type:
Full-time
Work Type:
Hybrid work



Similar Jobs for Senior Data Engineer


Senior Data Engineer II

We are looking for a skilled Data Engineer to join our growing team. You will pl...
Location:
India, Hyderabad
Salary:
Not provided
Alter Domus
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 3+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
  • Cloud & Orchestration: Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
  • Familiarity with best practices for Airflow DAG structure, dependency management, and error handling
  • AWS Expertise: Hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM)
  • Snowflake: Proficient in setting up data warehouses, configuring security, and optimizing queries on Snowflake
  • Data Ingestion & Transformation: Experience with Airbyte or similar tools for data ingestion
  • dbt or other SQL-based transformation frameworks for modular data processing
  • Programming: Proficiency in Python and/or Java/Scala for building data pipelines and custom integrations
  • Query Languages: Advanced knowledge of SQL for data manipulation and analysis
Job Responsibility:
  • Data Pipeline Orchestration: Design, build, and maintain end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows
  • Implement DAGs (Directed Acyclic Graphs) with retry logic, error handling, and alerting to ensure data quality and pipeline reliability (a minimal DAG sketch follows this list)
  • Data Ingestion & Transformation: Integrate data from various sources using Airbyte for ingestion and dbt for transformations in a scalable and modular fashion
  • Collaborate with Data Analysts and Data Scientists to implement transformations and business logic, ensuring data is analytics-ready
  • Data Modeling & Warehousing: Design and implement efficient data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse)
  • Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases
  • Data Governance & Security: Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance
  • Work closely with IT security to establish robust encryption standards, audit trails, and identity/role-based access
  • Performance Optimization: Optimize AWS Athena queries and configurations (e.g., data partitioning) for performance and cost efficiency
  • Monitor and tune Airflow DAGs, Snowflake queries, and data transformations to improve throughput and reliability
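
As a rough illustration of the DAG pattern referenced above, the sketch below shows an Airflow 2.x DAG with retries, failure alerting, and a simple ingest-then-transform dependency. The task callables, schedule, and alert address are assumptions for the example, not Alter Domus code.

```python
# Hedged sketch of an Airflow DAG with retries, alerting, and a simple
# ingest -> transform dependency. Assumes Airflow 2.4+ (for the `schedule`
# argument); the DAG id, callables, and alert address are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "owner": "data-engineering",
    "retries": 3,                              # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "email": ["data-alerts@example.com"],      # placeholder alert address
    "email_on_failure": True,
}

def ingest_raw_files(**context):
    """Placeholder: pull source files (e.g. via an ingestion tool) into S3."""
    ...

def run_dbt_models(**context):
    """Placeholder: trigger dbt transformations for the ingested data."""
    ...

with DAG(
    dag_id="daily_ingest_and_transform",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",        # daily at 02:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_files", python_callable=ingest_raw_files)
    transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)

    ingest >> transform   # transform only runs after a successful ingest
```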
What we offer:
  • Support for professional accreditations
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday
  • Continuous mentoring along your career progression
  • Active sports, events and social committees across our offices
  • 24/7 support available from our Employee Assistance Program
  • The opportunity to invest in our growth and success through our Employee Share Plan
  • Plus additional local benefits depending on your location

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location:
Finland, Helsinki
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements:
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling (see the Python/BigQuery sketch after this list)
  • Mentor peers through clean code, thoughtful reviews, and system design
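
To make the Python and BigQuery side of this concrete, here is a small, hedged sketch of loading a partitioned BigQuery table with the google-cloud-bigquery client. The project, bucket, dataset, and table names are invented for the example and are not Aiven's actual pipeline.

```python
# Minimal sketch of loading a partitioned, analytics-ready table in BigQuery
# from Python. Project, bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    time_partitioning=bigquery.TimePartitioning(field="event_date"),
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/2025-12-06/*.json",   # placeholder GCS path
    "example-analytics-project.raw.events",
    job_config=job_config,
)
load_job.result()   # block until the load finishes (raises on failure)
```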
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Extensive Occupational Health Care, Dental Care, as well as sports, culture, massage and lunch benefits
  • Regular office breakfast

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location:
Ireland, Cork
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements:
  • Proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL
  • Familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Plankton program recognizing contributions to the open source ecosystem, for developers and non-developers alike
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Private medical & dental health insurance
  • Childbirth cash benefit
  • Full-time

Senior Data Engineer

We’re hiring a Senior Data Engineer to build and own critical components of our ...
Location:
Germany, Berlin
Salary:
Not provided
Aiven Deutschland GmbH
Expiration Date:
Until further notice
Requirements:
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects that you find meaningful outside of work - and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Professional massage at the office
  • Health and fitness benefits through Urban Sport Club membership
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
84,835.61 - 149,076.17 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2)+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in the semantic layer of the data platform
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics (a minimal ingestion sketch follows this list)
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
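
The streaming responsibility above might look roughly like the following hedged sketch: a Pub/Sub streaming-pull subscriber in Python that decodes events and acknowledges them only after handling. The project, subscription, and downstream sink are placeholders, not Adtalem's actual setup.

```python
# Hedged sketch of real-time ingestion from GCP Pub/Sub. Project id,
# subscription id, and the downstream handler are illustrative placeholders.
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "example-project"          # placeholder
SUBSCRIPTION_ID = "events-sub"          # placeholder

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Decode one event and hand it to the downstream sink (e.g. BigQuery)."""
    event = json.loads(message.data.decode("utf-8"))
    # ... write `event` to the warehouse / streaming buffer here ...
    message.ack()   # acknowledge only after the event is safely handled

streaming_pull = subscriber.subscribe(subscription_path, callback=handle_message)
print(f"Listening on {subscription_path} ...")

with subscriber:
    try:
        streaming_pull.result(timeout=300)   # run for 5 minutes in this sketch
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()              # wait for shutdown to complete
```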
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
  • Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location:
United States, Lisle
Salary:
85,000 - 150,000 USD / Year
Adtalem Global Education
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (required)
  • Master's degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (preferred)
  • 2+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, Datastream, and Dataflow (required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Excellent organizational, prioritization, and analytical abilities
  • Proven experience working in incremental execution through successful launches
Job Responsibility:
  • Work closely with various business, IT, analyst, and data science groups to collect business requirements
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company (a minimal endpoint sketch follows this list)
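
As a hedged illustration of the REST API bullet above, the sketch below exposes a warehouse query through a small FastAPI endpoint. FastAPI, the route, and the BigQuery table are assumptions made for the example, not Adtalem's actual stack.

```python
# Hedged sketch of a small REST endpoint that exposes warehouse data to other
# teams. The framework choice, route, project, and table are placeholders.
from fastapi import FastAPI, HTTPException
from google.cloud import bigquery

app = FastAPI(title="data-products-api")
bq = bigquery.Client()

@app.get("/metrics/enrollment/{term}")
def enrollment_by_term(term: str) -> dict:
    """Return aggregated enrollment counts for one term (placeholder query)."""
    query = """
        SELECT campus, COUNT(*) AS students
        FROM `example-project.semantic.enrollment`
        WHERE term = @term
        GROUP BY campus
    """
    job = bq.query(
        query,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[bigquery.ScalarQueryParameter("term", "STRING", term)]
        ),
    )
    rows = [dict(row) for row in job.result()]
    if not rows:
        raise HTTPException(status_code=404, detail=f"No data for term {term}")
    return {"term": term, "campuses": rows}
```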
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Full-time

Senior Data Engineer

We are looking for a highly skilled Senior Data Engineer to lead the design and ...
Location:
United Kingdom
Salary:
45,000 - 60,000 GBP / Year
Activate Group Limited
Expiration Date:
Until further notice
Requirements:
  • Proven experience as a Senior Data Engineer, BI/Data Warehouse Engineer, or similar
  • Strong hands-on expertise with Microsoft Fabric and related services
  • End-to-end DWH development experience, from ingestion to modelling and consumption
  • Strong background in data modelling, including star schema, dimensional modelling and semantic modelling
  • Experience with orchestration, monitoring and optimisation of data pipelines
  • Proficiency in SQL and strong understanding of database principles
  • Ability to design scalable data architectures aligned to business needs
Job Responsibility:
  • Lead the design, architecture and build of a new enterprise data warehouse on Microsoft Fabric
  • Develop robust data pipelines, orchestration processes and monitoring frameworks using Fabric components (Data Factory, Data Engineering, Lakehouse)
  • Create scalable and high-quality data models to support analytics, Power BI reporting and self-service data consumption (see the star-schema sketch after this list)
  • Establish and enforce data governance, documentation and best practices across the data ecosystem
  • Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
  • Provide technical leadership, mentoring and guidance to junior team members where required
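
To illustrate the dimensional-modelling work described above, here is a minimal PySpark sketch that derives one dimension and one fact table and persists them as Delta tables for downstream Power BI consumption. The claims/repairer schema is invented for the example and is not Activate Group's actual model.

```python
# Hedged sketch of building a simple star-schema pair (one dimension, one fact)
# with PySpark in a Lakehouse-style environment. All table and column names
# are hypothetical; only the modelling pattern is the point.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

claims = spark.read.table("staging_claims")        # placeholder staging table

# Dimension: one row per repairer, with a surrogate key for joins.
dim_repairer = (
    claims.select("repairer_id", "repairer_name", "region")
    .dropDuplicates(["repairer_id"])
    .withColumn("repairer_key", F.xxhash64("repairer_id"))
)

# Fact: one row per claim, carrying the dimension key plus measures.
fact_claims = (
    claims.join(dim_repairer.select("repairer_id", "repairer_key"), "repairer_id")
    .select("claim_id", "repairer_key", "claim_date", "repair_cost")
)

# Persist as Delta tables so a Power BI semantic model can consume them.
dim_repairer.write.format("delta").mode("overwrite").saveAsTable("gold.dim_repairer")
fact_claims.write.format("delta").mode("overwrite").saveAsTable("gold.fact_claims")
```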
What we offer:
  • 33 days holiday (including bank holidays)
  • Personal health cash plan – claim back the cost of things like dentist and optical check-ups
  • Enhanced maternity / paternity / adoption / shared parental pay
  • Life assurance: three times basic salary
  • Free breakfasts and fruit
  • Birthday surprise for everybody
  • Full-time

Senior Data Engineer

For this role, we are seeking a Senior Data Engineer for our Client's ETL Suppor...
Location:
India
Salary:
Not provided
3Pillar Global
Expiration Date:
Until further notice
Requirements:
  • In-depth knowledge of AWS Glue, AWS Lambda, and AWS Step Functions
  • A deep understanding of ETL processes and data warehouse design
  • Proven ability to troubleshoot data pipelines and perform root cause analysis (RCA)
  • 3-5 years of relevant experience
  • Hands-on experience with Glue, Lambda, and Step Function development
  • Must be able to work a day shift that includes coverage for weekends and holidays on a rotational basis
Job Responsibility:
  • Monitor approximately 2,300 scheduled jobs (daily, weekly, monthly) to ensure timely and successful execution (a monitoring sketch follows this list)
  • Execute on-demand jobs as required by the business
  • Troubleshoot job failures, perform detailed root cause analysis (RCA), and provide clear documentation for all findings
  • Address and resolve bugs and data-related issues reported by the business team
  • Verify source file placement in designated directories to maintain data integrity
  • Reload Change Data Capture (CDC) tables when structural changes occur in source systems
  • Help manage synchronization between external databases (including Teradata write-backs) and AWS Glue tables
  • Assist in developing new solutions, enhancements, and bug fixes using AWS Glue, Lambda, and Step Functions
  • Answer questions from the business and support User Acceptance Testing (UAT) inquiries
  • Make timely decisions to resolve issues, execute tasks efficiently, and escalate complex problems to senior or lead engineers as needed, all while maintaining agreed-upon SLAs
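
As a rough sketch of the job-monitoring duty above, the Python snippet below uses boto3 to list recent AWS Glue job runs and flag failures. The job names and the alerting step are placeholders; a real setup would open a ticket or page the on-call engineer rather than print.

```python
# Hedged sketch of checking recent AWS Glue job runs and flagging failures.
# Job names and the alerting step are placeholders, not the client's config.
from datetime import datetime, timedelta, timezone

import boto3

glue = boto3.client("glue")
WATCHED_JOBS = ["daily_sales_load", "cdc_customer_sync"]   # hypothetical names
SINCE = datetime.now(timezone.utc) - timedelta(hours=24)

def recent_failures(job_name: str) -> list[dict]:
    """Return failed/timed-out runs for one job within the lookback window."""
    runs = glue.get_job_runs(JobName=job_name, MaxResults=50)["JobRuns"]
    return [
        run for run in runs
        if run["StartedOn"] >= SINCE
        and run["JobRunState"] in ("FAILED", "TIMEOUT", "ERROR")
    ]

for job in WATCHED_JOBS:
    for run in recent_failures(job):
        # In practice this would raise a ticket or alert; printing keeps the
        # sketch self-contained.
        print(f"{job}: run {run['Id']} ended {run['JobRunState']} - "
              f"{run.get('ErrorMessage', 'no error message')}")
```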