Senior Data Engineer

Appen

Location: India, Hyderabad

Category: IT - Software Development

Contract Type: Not provided

Salary: Not provided

Job Description:

We’re hiring a Senior Data Engineer with strong experience in AWS and Databricks to build scalable data solutions that power next-gen AI and machine learning. Join our fast-growing team to work on impactful projects, collaborate with top talent, and drive innovation at scale.

Job Responsibility:

  • Design, build, and manage large-scale data infrastructures using a variety of AWS technologies such as Amazon Redshift, AWS Glue, Amazon Athena, AWS Data Pipeline, Amazon Kinesis, Amazon EMR, and Amazon RDS
  • Design, develop, and maintain scalable data pipelines and architectures on Databricks using tools such as Delta Lake, Unity Catalog, and Apache Spark (Python or Scala), or similar technologies (a minimal sketch follows this list)
  • Integrate Databricks with cloud platforms like AWS to ensure smooth and secure data flow across systems
  • Build and automate CI/CD pipelines for deploying, testing, and monitoring Databricks workflows and data jobs
  • Continuously optimize data workflows for performance, reliability, and security, applying Databricks best practices around data governance and quality
  • Ensure the performance, availability, and security of datasets across the organization, utilizing AWS’s robust suite of tools for data management
  • Collaborate with data scientists, software engineers, product managers, and other key stakeholders to develop data-driven solutions and models
  • Translate complex functional and technical requirements into detailed design proposals and implement them
  • Mentor junior and mid-level data engineers, fostering a culture of continuous learning and improvement within the team
  • Identify, troubleshoot, and resolve complex data-related issues
  • Champion best practices in data management, ensuring the cleanliness, integrity, and accessibility of our data
  • Optimize and fine-tune data queries and processes for performance
  • Evaluate and advise on technological components, such as software, hardware, and networking capabilities, for database management systems and infrastructure
  • Stay informed on the latest industry trends and technologies to ensure our data infrastructure is modern and robust.
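
As a hedged illustration of the Databricks work above, the following minimal PySpark sketch shows a Delta Lake batch pipeline of the kind these responsibilities describe. It assumes a Databricks runtime with Unity Catalog; the landing path, columns, and table name are hypothetical examples, not Appen's actual stack.

```python
# Minimal sketch of a Delta Lake batch pipeline on Databricks (PySpark).
# The landing path, columns, and Unity Catalog table name are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-provided in Databricks notebooks

# Ingest raw JSON events from a cloud-storage landing zone.
raw = spark.read.format("json").load("s3://example-landing-zone/events/")

# Light cleansing plus a derived partition column.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Append to a Delta table registered in Unity Catalog (catalog.schema.table).
(
    cleaned.write.format("delta")
    .mode("append")
    .partitionBy("event_date")
    .saveAsTable("main.analytics.events")
)
```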

Requirements:

  • 5-7 years of hands-on experience with AWS data engineering technologies, such as Amazon Redshift, AWS Glue, AWS Data Pipeline, Amazon Kinesis, Amazon RDS, and Apache Airflow
  • Hands-on experience working with Databricks, including Delta Lake, Apache Spark (Python or Scala), and Unity Catalog
  • Demonstrated proficiency in SQL and NoSQL databases, ETL tools, and data pipeline workflows
  • Experience with Python and/or Java
  • Deep understanding of data structures, data modeling, and software architecture
  • Strong problem-solving skills and attention to detail
  • Self-motivated and able to work independently, with excellent organizational and multitasking skills
  • Exceptional communication skills, with the ability to explain complex data concepts to non-technical stakeholders
  • Bachelor's Degree in Computer Science, Information Systems, or a related field. A Master's Degree is preferred.

Nice to have:

Experience with AI and machine learning technologies is highly desirable.

Additional Information:

Job Posted: December 06, 2025
Employment Type: Fulltime
Work Type: On-site work

Similar Jobs for Senior Data Engineer

Senior Data Engineer II

We are looking for a skilled Data Engineer to join our growing team. You will pl...
Location: India, Hyderabad
Salary: Not provided
Company: Alter Domus
Expiration Date: Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • 3+ years of experience as a Data Engineer or in a similar role working with cloud-based data platforms
  • Cloud & Orchestration: Airflow (self-managed or managed services like Amazon MWAA) for workflow orchestration, DAG development, and scheduling
  • Familiarity with best practices for Airflow DAG structure, dependency management, and error handling
  • AWS Expertise: Hands-on experience with AWS Lake Formation, S3, Athena, and related services (e.g., Lambda, Glue, IAM)
  • Snowflake: Proficient in setting up data warehouses, configuring security, and optimizing queries on Snowflake
  • Data Ingestion & Transformation: Experience with Airbyte or similar tools for data ingestion
  • dbt or other SQL-based transformation frameworks for modular data processing
  • Programming: Proficiency in Python and/or Java/Scala for building data pipelines and custom integrations
  • Query Languages: Advanced knowledge of SQL for data manipulation and analysis
Job Responsibility:
  • Data Pipeline Orchestration: Design, build, and maintain end-to-end data pipelines using Airflow (including managed services like Amazon MWAA) to orchestrate, schedule, and monitor batch/streaming workflows
  • Implement DAGs (Directed Acyclic Graphs) with retry logic, error handling, and alerting to ensure data quality and pipeline reliability (a minimal sketch follows this list)
  • Data Ingestion & Transformation: Integrate data from various sources using Airbyte for ingestion and dbt for transformations in a scalable and modular fashion
  • Collaborate with Data Analysts and Data Scientists to implement transformations and business logic, ensuring data is analytics-ready
  • Data Modeling & Warehousing: Design and implement efficient data models for both structured and semi-structured data in AWS S3 (data lake) and Snowflake (data warehouse)
  • Ensure data schemas and transformations support advanced analytics, BI reporting, and machine learning use cases
  • Data Governance & Security: Utilize AWS Lake Formation APIs and best practices to maintain data security, access controls, and compliance
  • Work closely with IT security to establish robust encryption standards, audit trails, and identity/role-based access
  • Performance Optimization: Optimize AWS Athena queries and configurations (e.g., data partitioning) for performance and cost efficiency
  • Monitor and tune Airflow DAGs, Snowflake queries, and data transformations to improve throughput and reliability
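
As a hedged illustration of the orchestration bullet above, here is a minimal Airflow DAG with retry logic, error handling, and a failure callback. It assumes Airflow 2.4+ (as on recent Amazon MWAA); the task bodies and alerting hook are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch with retries, error handling, and alerting.
# Task bodies and the notification hook are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    ...  # e.g. trigger an Airbyte sync or pull from a source API


def transform():
    ...  # e.g. run dbt models against the warehouse


def notify_failure(context):
    # Alerting hook (Slack, email, PagerDuty); print stands in here.
    print(f"Task {context['task_instance'].task_id} failed")


default_args = {
    "retries": 3,                        # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,
}

with DAG(
    dag_id="example_ingest_transform",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform only runs after a successful extract
```
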
What we offer:
  • Support for professional accreditations
  • Flexible arrangements, generous holidays, plus an additional day off for your birthday
  • Continuous mentoring along your career progression
  • Active sports, events and social committees across our offices
  • 24/7 support available from our Employee Assistance Program
  • The opportunity to invest in our growth and success through our Employee Share Plan
  • Plus additional local benefits depending on your location

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location: Finland, Helsinki
Salary: Not provided
Company: Aiven Deutschland GmbH
Expiration Date: Until further notice
Requirements:
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling (a minimal sketch follows this list)
  • Mentor peers through clean code, thoughtful reviews, and system design
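
As a small, hedged sketch of the Python/dbt/BigQuery stack named above, this is one way to read a dbt-built BigQuery model from Python; the project, dataset, and table names are hypothetical.

```python
# Minimal sketch: read a (hypothetical) dbt-built BigQuery model from Python.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT signup_date, COUNT(*) AS signups
    FROM `example-project.analytics.fct_signups`
    GROUP BY signup_date
    ORDER BY signup_date DESC
    LIMIT 7
"""

for row in client.query(query).result():
    print(row.signup_date, row.signups)
```
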
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Extensive Occupational Health Care, Dental Care, as well as sports, culture, massage and lunch benefits
  • Regular office breakfast

Senior Data Engineer

We’re looking for a Senior Data Engineer to join our team who shares our passion...
Location: Ireland, Cork
Salary: Not provided
Company: Aiven Deutschland GmbH
Expiration Date: Until further notice
Requirements:
  • Proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL
  • Familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • The Plankton program recognizes extra contributions to the open source ecosystem, for developers and non-developers alike
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Private medical & dental health insurance
  • Childbirth cash benefit
Employment Type: Fulltime

Senior Data Engineer

We’re hiring a Senior Data Engineer to build and own critical components of our ...
Location: Germany, Berlin
Salary: Not provided
Company: Aiven Deutschland GmbH
Expiration Date: Until further notice
Requirements:
  • Have a proven track record of delivering complex, large-scale data projects from design to production
  • Excel at breaking down complex business requirements into clear, actionable technical plans
  • Have built tools or systems that are essential to how your team or stakeholders work
  • Write clean, effective Python and SQL and are familiar with tools like dbt and Airflow
  • Focus on building things that last, with simple and resilient design
  • Enjoy collaborating and sharing your thinking with your peers
Job Responsibility:
  • Own and deliver key data projects, collaborating closely with stakeholders from Product, Sales, Marketing, and Finance
  • Take ownership of the technical design, implementation, and maintenance of data pipelines and modeling solutions
  • Build systems that are reliable, reusable, and designed for long-term clarity
  • Contribute to architectural decisions and advocate for best practices in Python, dbt, BigQuery, and modern data tooling
  • Mentor peers through clean code, thoughtful reviews, and system design
What we offer:
  • Participate in Aiven’s equity plan
  • Hybrid work policy
  • Get the equipment you need to set yourself up for success
  • Real employer support (use one of our learning platforms, annual learning budget, and more)
  • Get holistic wellbeing support through our global Employee Assistance Program
  • Contribute to open source projects that you find meaningful outside of work, and get paid for it
  • Use up to 5 days per year to volunteer for a good cause of your choice
  • Join one of our team member resource groups
  • Professional massage at the office
  • Health and fitness benefits through Urban Sport Club membership
Employment Type: Fulltime

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location: United States, Lisle
Salary: 84835.61 - 149076.17 USD / Year
Company: Adtalem Global Education
Expiration Date: Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field
  • Two (2)+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, Dataflow, BQML, and Vertex AI
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics
  • Hands-on experience working with real-time, unstructured, and synthetic data
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Expert knowledge of Python programming and SQL
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed
  • Familiarity with synthetic data generation and unstructured data processing
Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth
  • Design, develop, deploy, and support high-performance data pipelines, both inbound and outbound
  • Model the data platform by applying business logic and building objects in its semantic layer
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics (a minimal sketch follows this list)
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store, and socialize data lineage and operational metadata
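
As a hedged sketch of the real-time processing mentioned above (and the GCP Pub/Sub requirement), this is the standard Pub/Sub streaming-pull pattern in Python; the project and subscription IDs are hypothetical.

```python
# Minimal GCP Pub/Sub streaming-pull sketch; project and subscription IDs
# are hypothetical placeholders.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("example-project", "events-sub")


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real pipeline would validate and land the payload (e.g. into BigQuery
    # or GCS) before acknowledging; printing stands in here.
    print(f"Received: {message.data!r}")
    message.ack()


streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull_future.result(timeout=60)  # block briefly for the demo
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()  # wait for shutdown to complete
```
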
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program
Employment Type: Fulltime

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location: United States, Lisle
Salary: 85000.00 - 150000.00 USD / Year
Company: Adtalem Global Education
Expiration Date: Until further notice
Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (required)
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (preferred)
  • 2+ years of experience in Google Cloud with services such as BigQuery, Composer, GCS, Datastream, and Dataflow (required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (required)
  • Expert knowledge of SQL and Python programming
  • Experience working with Airflow as a workflow management tool, including building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in real-time data ingestion using GCP Pub/Sub, Kafka, Spark, or similar
  • Excellent organizational, prioritization, and analytical abilities
  • Proven experience delivering work incrementally through successful launches
Job Responsibility:
  • Work closely with various business, IT, Analyst and Data Science groups to collect business requirements
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure the quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company (a minimal sketch follows this list)
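
One way such a REST API might look, sketched with FastAPI; the framework choice, endpoint, and data are illustrative assumptions, not the employer's actual stack.

```python
# Illustrative REST API sketch for exposing curated data to other teams.
# FastAPI is an assumed framework choice; names and data are hypothetical.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="data-products-api")

# Stand-in for a warehouse query layer (e.g. a BigQuery client).
_ENROLLMENT_METRICS = {"2025-01": 1200, "2025-02": 1350}


@app.get("/metrics/enrollment/{month}")
def enrollment(month: str) -> dict:
    """Return a single month's enrollment metric."""
    if month not in _ENROLLMENT_METRICS:
        raise HTTPException(status_code=404, detail="month not found")
    return {"month": month, "enrollment": _ENROLLMENT_METRICS[month]}
```

Run locally with: uvicorn main:app --reload (assuming the code lives in main.py).
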
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
Employment Type: Fulltime

Senior Data Engineer

We are looking for a highly skilled Senior Data Engineer to lead the design and ...
Location: United Kingdom
Salary: 45000.00 - 60000.00 GBP / Year
Company: Activate Group Limited
Expiration Date: Until further notice
Requirements:
  • Proven experience as a Senior Data Engineer, BI/Data Warehouse Engineer, or similar
  • Strong hands-on expertise with Microsoft Fabric and related services
  • End-to-end DWH development experience, from ingestion to modelling and consumption
  • Strong background in data modelling, including star schema, dimensional modelling and semantic modelling
  • Experience with orchestration, monitoring and optimisation of data pipelines
  • Proficiency in SQL and strong understanding of database principles
  • Ability to design scalable data architectures aligned to business needs
Job Responsibility:
  • Lead the design, architecture and build of a new enterprise data warehouse on Microsoft Fabric
  • Develop robust data pipelines, orchestration processes and monitoring frameworks using Fabric components (Data Factory, Data Engineering, Lakehouse)
  • Create scalable and high-quality data models to support analytics, Power BI reporting and self-service data consumption (a minimal sketch follows this list)
  • Establish and enforce data governance, documentation and best practices across the data ecosystem
  • Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
  • Provide technical leadership, mentoring and guidance to junior team members where required
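
As a minimal illustration of the dimensional modelling above, here is a PySpark sketch of a simple star-schema load, as it might run in a Microsoft Fabric Lakehouse notebook; table and column names are hypothetical, and slowly-changing-dimension handling is omitted.

```python
# Minimal star-schema load sketch (PySpark); names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-provided in Fabric notebooks

orders = spark.table("staging_orders")  # raw staged orders

# Dimension: one row per customer (no SCD handling in this sketch).
dim_customer = (
    orders.select("customer_id", "customer_name")
          .dropDuplicates(["customer_id"])
)

# Fact: measures keyed by the customer natural key and order date.
fact_sales = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .select("order_id", "customer_id", "order_date", "amount")
)

dim_customer.write.mode("overwrite").saveAsTable("dim_customer")
fact_sales.write.mode("overwrite").saveAsTable("fact_sales")
```
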
What we offer:
  • 33 days holiday (including bank holidays)
  • Personal health cash plan: claim back the cost of things like dentist and optical check-ups
  • Enhanced maternity / paternity / adoption / shared parental pay
  • Life assurance: three times basic salary
  • Free breakfasts and fruit
  • Birthday surprise for everybody
Employment Type: Fulltime

Senior Data Engineer

For this role, we are seeking a Senior Data Engineer for our Client's ETL Suppor...
Location: India
Salary: Not provided
Company: 3Pillar Global
Expiration Date: Until further notice
Requirements:
  • In-depth knowledge of AWS Glue, AWS Lambda, and AWS Step Functions
  • A deep understanding of ETL processes and data warehouse design
  • Proven ability to troubleshoot data pipelines and perform root cause analysis (RCA)
  • 3-5 years of relevant experience
  • Hands-on experience with Glue, Lambda, and Step Function development
  • Must be able to work a day shift that includes coverage for weekends and holidays on a rotational basis
Job Responsibility:
  • Monitor approximately 2,300 scheduled jobs (daily, weekly, monthly) to ensure timely and successful execution
  • Execute on-demand jobs as required by the business
  • Troubleshoot job failures, perform detailed root cause analysis (RCA), and provide clear documentation for all findings (a minimal triage sketch follows this list)
  • Address and resolve bugs and data-related issues reported by the business team
  • Verify source file placement in designated directories to maintain data integrity
  • Reload Change Data Capture (CDC) tables when structural changes occur in source systems
  • Help manage synchronization between external databases (including Teradata write-backs) and AWS Glue tables
  • Assist in developing new solutions, enhancements, and bug fixes using AWS Glue, Lambda, and Step Functions
  • Answer questions from the business and support User Acceptance Testing (UAT) inquiries
  • Make timely decisions to resolve issues, execute tasks efficiently, and escalate complex problems to senior or lead engineers as needed, all while maintaining agreed-upon SLAs
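
As a hedged sketch of the triage work above, this small boto3 script lists recent failed AWS Step Functions executions and pulls their error details for RCA; the state machine ARN is a hypothetical placeholder.

```python
# Triage sketch: surface recent failed Step Functions executions for RCA.
# The state machine ARN is a hypothetical placeholder.
import boto3

sfn = boto3.client("stepfunctions")
ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:example-etl"

failed = sfn.list_executions(
    stateMachineArn=ARN, statusFilter="FAILED", maxResults=10
)
for execution in failed["executions"]:
    detail = sfn.describe_execution(executionArn=execution["executionArn"])
    # "error" and "cause" are present only for failed executions.
    print(execution["name"], detail.get("error"), detail.get("cause"))
```
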
Employment Type: Fulltime