Senior Data Engineer

Orbis Consultants

Location:
United States

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

A VC-backed conversational AI scale-up is expanding its engineering team and is looking for a Senior Data Engineer with deep Spark expertise to help architect and scale the data backbone behind cutting-edge AI-driven systems.

Job Responsibility:

  • Design and evolve distributed, cloud-based data infrastructure that supports both real-time and batch processing at scale
  • Build high-performance data pipelines that power analytics, AI/ML workloads, and integrations with third-party platforms
  • Champion data reliability, quality, and observability, introducing automation and monitoring across pipelines
  • Collaborate closely with engineering, product, and AI teams to deliver data solutions for business-critical initiatives

Requirements:

  • 4+ years in software development and data engineering with ownership of production-grade systems
  • Proven expertise in Spark/PySpark (see the sketch after this list)
  • Strong knowledge of distributed computing and modern data modeling approaches
  • Solid programming skills in Python, with an emphasis on clean, maintainable code
  • Hands-on experience with SQL and NoSQL databases (e.g., PostgreSQL, DynamoDB, Cassandra)
  • Excellent communicator who can influence and partner across teams
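
For orientation, here is a minimal, hedged sketch of the kind of Spark/PySpark batch pipeline the requirements above point to. The bucket paths, column names, and aggregation are hypothetical placeholders, not details from this posting.

    # Illustrative PySpark batch pipeline; paths, columns, and tables are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events-daily-rollup").getOrCreate()

    # Read raw events landed by an upstream ingestion job.
    events = spark.read.parquet("s3://example-bucket/raw/events/")

    # Clean and aggregate: drop rows without a user, roll up to daily counts per user.
    daily = (
        events
        .where(F.col("user_id").isNotNull())
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date", "user_id")
        .agg(F.count("*").alias("event_count"))
    )

    # Write partitioned output for downstream analytics and AI/ML feature jobs.
    (daily.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-bucket/curated/daily_user_events/"))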

Nice to have:

  • Experience in high-growth, early-stage environments
  • Familiarity with MLOps and deploying ML models into production data workflows
  • A problem-solver at heart, excited by innovation and complex challenges

What we offer:

  • Great equity

Additional Information:

Job Posted:
December 11, 2025

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Senior Data Engineer

Senior Data Engineer

As a senior member of our engineering team, you will take ownership of critical ...
Location:
Poland
Salary:
Not provided
Userlane GmbH
Expiration Date:
Until further notice
Requirements:
  • Minimum of 5 years of hands-on experience in designing and developing data processing systems
  • Experience being part of a team of software engineers and helping establish processes from scratch
  • Familiarity with a DBMS like ClickHouse or another SQL-based OLAP database
  • Experience with various data engineering tools like Airflow, Kafka, dbt (see the sketch after this list)
  • Experience building and maintaining applications with the following languages: Python, Golang, TypeScript
  • Knowledge of container technologies like Docker and Kubernetes
  • Experience with CI/CD pipelines and automated testing
  • Ability to solve problems and balance structure with creativity
  • Ability to operate independently and apply strategic thinking with technical depth
  • Willingness to share information and skills with the team
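
A hedged sketch of how the Kafka and ClickHouse pieces named above could fit together in Python; the topic, table, and host names are invented, and kafka-python plus clickhouse-driver are just one possible client pairing.

    # Illustrative Kafka -> ClickHouse loader; topic, table, and hosts are hypothetical.
    import json
    from kafka import KafkaConsumer          # kafka-python
    from clickhouse_driver import Client     # clickhouse-driver

    consumer = KafkaConsumer(
        "user-events",
        bootstrap_servers=["localhost:9092"],
        group_id="events-loader",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
    )
    clickhouse = Client(host="localhost")

    batch = []
    for message in consumer:
        event = message.value
        batch.append((event["user_id"], event["event_type"], event["ts"]))
        if len(batch) >= 1000:
            # Batch inserts keep ClickHouse writes efficient.
            clickhouse.execute(
                "INSERT INTO events (user_id, event_type, ts) VALUES", batch
            )
            batch.clear()
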
Job Responsibility:
  • Shape and maintain our various data and backend components - DBs, APIs and services
  • Understand business requirements and analyze their impact on the design of our software services and tools
  • Identify architectural changes needed in our infrastructure to support a smooth process of adding new features
  • Research, propose, and deliver changes to our software architecture to address our engineering and product requirements
  • Design, develop, and maintain a solid and stable RESTful API based on industry standards and best practices
  • Collaborate with internal and external teams to deliver software that fits the overall ecosystem of our products
  • Stay up to date with the new trends and technologies that enable us to work smarter, not harder
What we offer:
  • Team & Culture: A high-performance culture with great leadership and a fun, engaged, motivated, and diverse team with people from over 20 countries
  • Market: Userlane is among the global leaders in the rapidly growing Digital Adoption industry
  • Growth: We take you and your development seriously. You can expect weekly 1:1s, a personalised skills assessment and development plan, on-the-job coaching, and a budget for events and training
  • Compensation: Significant financial upside with an attractive and incentivising package on B2B basis
  • Full-time

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location:
Vietnam, Hanoi
Salary:
Not provided
Token Metrics
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of Python, Java or any programming language development experience
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expert with building Data Lake, Data Warehouse or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work
Job Responsibility:
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building a data pipeline from different data sources using different data types such as API, CSV, JSON, etc. (see the sketch after this list)
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs
  • Full-time
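
As a sketch of the multi-source pipeline responsibility above, the snippet below normalizes an API payload, a CSV file, and a JSON file into one frame with pandas; the endpoint and file paths are hypothetical.

    # Illustrative multi-source ingestion; the endpoint and file paths are hypothetical.
    import pandas as pd
    import requests

    def load_api(url: str) -> pd.DataFrame:
        """Pull a JSON payload from an HTTP API and flatten it into rows."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return pd.json_normalize(response.json())

    def load_sources() -> pd.DataFrame:
        frames = [
            load_api("https://api.example.com/v1/prices"),   # hypothetical API endpoint
            pd.read_csv("data/listings.csv"),                # CSV extract
            pd.read_json("data/trades.json", lines=True),    # newline-delimited JSON
        ]
        # Align on shared columns so downstream jobs see one consistent schema.
        return pd.concat(frames, ignore_index=True, sort=False)

    if __name__ == "__main__":
        print(load_sources().shape)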

Senior Data Engineer

We are seeking a highly skilled and motivated Senior Data Engineer/s to architec...
Location:
India, Hyderabad
Salary:
Not provided
Tech Mahindra
Expiration Date:
January 30, 2026
Requirements:
  • 7-10 years of experience in data engineering with a focus on Microsoft Azure and Fabric technologies
  • Strong expertise in: Microsoft Fabric (Lakehouse, Dataflows Gen2, Pipelines, Notebooks)
  • Strong expertise in: Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen2
  • Strong expertise in: Power BI and/or other visualization tools
  • Strong expertise in: Azure Functions, Logic Apps, and orchestration frameworks
  • Strong expertise in: SQL, Python and PySpark/Scala
  • Experience working with structured and semi-structured data (JSON, XML, CSV, Parquet)
  • Proven ability to build metadata driven architectures and reusable components
  • Strong understanding of data modeling, data governance, and security best practices
Job Responsibility:
  • Design and implement ETL pipelines using Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Warehouse, SQL) and Azure Data Factory (see the sketch after this list)
  • Build and maintain a metadata driven Lakehouse architecture with threaded datasets to support multiple consumption patterns
  • Develop agent specific data lakes and an orchestration layer for an uber agent that can query across agents to answer customer questions
  • Enable interactive data consumption via Power BI, Azure OpenAI, and other analytics tools
  • Ensure data quality, lineage, and governance across all ingestion and transformation processes
  • Collaborate with product teams to understand data needs and deliver scalable solutions
  • Optimize performance and cost across storage and compute layers
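
A hedged sketch of one metadata-driven ingestion step of the kind described above, written as PySpark and assuming a Delta-enabled lakehouse runtime such as Microsoft Fabric or Databricks; the inline control record, paths, and table names are invented.

    # Illustrative metadata-driven bronze ingestion; names and paths are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

    # A tiny inline "metadata" record standing in for a real control table.
    source = {"path": "Files/raw/orders/*.json", "target": "bronze_orders", "keys": ["order_id"]}

    raw = spark.read.json(source["path"])

    cleaned = (
        raw.dropDuplicates(source["keys"])
           .withColumn("ingest_ts", F.current_timestamp())
    )

    # Land the cleaned batch as a Delta table for downstream silver/gold layers.
    cleaned.write.mode("append").format("delta").saveAsTable(source["target"])
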

Senior Data Engineer

As a Data Software Engineer, you will leverage your skills in data science, mach...
Location:
United States, Arlington; Woburn
Salary:
134000.00 - 184000.00 USD / Year
STR
Expiration Date:
Until further notice
Requirements:
  • Ability to obtain a Top Secret (TS) security clearance, for which U.S. citizenship is required by the U.S. government
  • 5+ years of experience in one or more high level programming languages, like Python
  • Experience working with large datasets for machine learning applications (see the sketch after this list)
  • Experience in navigating and contributing to complex, large code bases
  • Experience with containerization practices and CI/CD workflows
  • BS, MS, or PhD in a related field or equivalent experience
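
One small, hedged example of working with datasets too large to load at once: chunked aggregation with pandas. The file name and feature columns are hypothetical.

    # Illustrative chunked processing of a large CSV; file and columns are hypothetical.
    import pandas as pd

    CHUNK_ROWS = 100_000

    def feature_stats(path: str) -> pd.DataFrame:
        """Aggregate per-label feature means without loading the full dataset into memory."""
        partials = []
        for chunk in pd.read_csv(path, chunksize=CHUNK_ROWS):
            partials.append(chunk.groupby("label")[["feat_a", "feat_b"]].mean())
        # Average the per-chunk means; a count-weighted mean would be more precise.
        return pd.concat(partials).groupby(level=0).mean()

    if __name__ == "__main__":
        print(feature_stats("data/training_events.csv"))
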
Job Responsibility:
  • Explore a variety of data sources to build and integrate machine learning models into software
  • Develop machine learning and deep learning algorithms for large-scale multi-modal problems
  • Work with large data sets and develop software solutions for scalable analysis
  • Collaborate to create and maintain software for data pipelines, algorithms, storage, and access
  • Monitor software deployments, create logging frameworks, and design APIs
  • Build analytic tools that utilize data pipelines to provide actionable insights into customer requests
  • Develop and execute plans to improve software robustness and ensure system performance
  • Full-time

Senior Data Engineer - Platform Enablement

SoundCloud empowers artists and fans to connect and share through music. Founded...
Location:
United States, New York; Atlanta; East Coast
Salary:
160000.00 - 210000.00 USD / Year
SoundCloud
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience in data engineering, analytics engineering, or similar roles
  • Expert-level SQL skills, including performance tuning, advanced joins, CTEs, window functions, and analytical query design
  • Proven experience with Apache Airflow (designing DAGs, scheduling, task dependencies, monitoring, Python)
  • Familiarity with event-driven architectures and messaging systems (Pub/Sub, Kafka, etc.)
  • Knowledge of data governance, schema management, and versioning best practices
  • Understanding of observability practices: logging, metrics, tracing, and incident response
  • Experience deploying and managing services in cloud environments, preferably GCP, AWS
  • Excellent communication skills and a collaborative mindset
Job Responsibility:
  • Develop and optimize SQL data models and queries for analytics, reporting, and operational use cases
  • Design and maintain ETL/ELT workflows using Apache Airflow, ensuring reliability, scalability, and data integrity (see the sketch after this list)
  • Collaborate with analysts and business teams to translate data needs into efficient, automated data pipelines and datasets
  • Own and enhance data quality and validation processes, ensuring accuracy and completeness of business-critical metrics
  • Build and maintain reporting layers, supporting dashboards and analytics tools (e.g. Looker, or similar)
  • Troubleshoot and tune SQL performance, optimizing queries and data structures for speed and scalability
  • Contribute to data architecture decisions, including schema design, partitioning strategies, and workflow scheduling
  • Mentor junior engineers, advocate for best practices and promote a positive team culture
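
A hedged sketch of the Airflow and SQL skills called out above: a small DAG (Airflow 2.4+ style) wrapping a windowed rollup query. The DAG id, task bodies, tables, and columns are hypothetical, and the warehouse execution layer is deliberately left as a placeholder.

    # Illustrative Airflow DAG for a daily rollup; names and SQL are hypothetical.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Example of the windowed SQL the role emphasizes: a 7-day rolling play count per user.
    DAILY_ROLLUP_SQL = """
    SELECT user_id,
           event_date,
           plays,
           SUM(plays) OVER (PARTITION BY user_id
                            ORDER BY event_date
                            ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS plays_7d
    FROM daily_plays
    """

    def extract():
        pass  # Placeholder: pull raw rows from the source system.

    def load_rollup():
        pass  # Placeholder: run DAILY_ROLLUP_SQL against the warehouse via the team's DB hook.

    with DAG(
        dag_id="daily_plays_rollup",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        rollup_task = PythonOperator(task_id="load_rollup", python_callable=load_rollup)
        extract_task >> rollup_task
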
What we offer:
  • Comprehensive health benefits including medical, dental, and vision plans, as well as mental health resources
  • Robust 401k program
  • Employee Equity Plan
  • Generous professional development allowance
  • Creativity and Wellness benefit
  • Flexible vacation and public holiday policy where you can take up to 35 days of PTO annually
  • 16 paid weeks for all parents (birthing and non-birthing), regardless of gender, to welcome newborns, adopted and foster children
  • Various snacks, goodies, and 2 free lunches weekly when at the office
  • Full-time

Senior Data Engineer

SoundCloud is looking for a Senior Data Engineer to join our growing Content Pla...
Location:
Germany; United Kingdom, Berlin; London
Salary:
Not provided
SoundCloud
Expiration Date:
Until further notice
Requirements:
  • Proven experience in backend engineering (Scala/Go/Python) with strong design and data modeling skills
  • Hands-on experience building ETL/ELT pipelines and streaming solutions on cloud platforms (GCP preferred)
  • Proficient in SQL and experienced with relational and NoSQL databases
  • Familiarity with event-driven architectures and messaging systems such as Pub/Sub and Kafka (see the sketch after this list)
  • Knowledge of data governance, schema management, and versioning best practices
  • Understanding of observability practices: logging, metrics, tracing, and incident response
  • Experience with containerization and orchestration (Docker, Kubernetes)
  • Experience deploying and managing services in cloud environments, preferably GCP, AWS
  • Strong collaboration skills and ability to work across backend, data, and product teams
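
A hedged sketch of an event-driven consumer on the Pub/Sub side of the requirements above, using the google-cloud-pubsub client; the project and subscription names are invented.

    # Illustrative Pub/Sub subscriber; project and subscription names are hypothetical.
    import json
    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("example-project", "content-updates-sub")

    def handle(message):
        event = json.loads(message.data.decode("utf-8"))
        # Validate and route the content event, then acknowledge it.
        print("received content event:", event.get("track_id"))
        message.ack()

    streaming_pull = subscriber.subscribe(subscription_path, callback=handle)
    try:
        streaming_pull.result()      # Block and process messages until interrupted.
    except KeyboardInterrupt:
        streaming_pull.cancel()
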
Job Responsibility:
  • Design, build, and maintain high-performance services for content modeling, serving, and integration
  • Develop data pipelines (batch & streaming) with cloud-native tools
  • Collaborate on rearchitecting the content model to support rich metadata
  • Implement APIs and data services that power internal products, external integrations, and real-time features
  • Ensure data quality, governance, and validation across ingestion, storage, and serving layers
  • Optimize system performance, scalability, and cost efficiency for both backend services and data workflows
  • Work with infrastructure-as-code (Terraform) and CI/CD pipelines for deployment and automation
  • Monitor, debug, and improve reliability using various observability tools (logging, tracing, metrics)
  • Collaborate with product leadership, music industry experts, and engineering teams across SoundCloud
What we offer:
  • Extensive relocation support including allowances, one-way flights, temporary accommodation and on-the-ground support on arrival
  • Creativity and Wellness benefit
  • Employee Equity Plan
  • Generous professional development allowance
  • Flexible vacation and public holiday policy where you can take up to 35 days of PTO annually
  • 16 paid weeks for all parents (birthing and non-birthing), regardless of gender, to welcome newborns, adopted and foster children
  • Free German courses at beginner, intermediate, and advanced levels
  • Various snacks, goodies, and 2 free lunches weekly when at the office
  • Full-time

Senior Data Engineer

Senior Data Engineer – Dublin (Hybrid) Contract Role | 3 Days Onsite. We are see...
Location:
Ireland, Dublin
Salary:
Not provided
Solas IT Recruitment
Expiration Date:
Until further notice
Requirements:
  • 7+ years of experience as a Data Engineer working with distributed data systems
  • 4+ years of deep Snowflake experience, including performance tuning, SQL optimization, and data modelling
  • Strong hands-on experience with the Hadoop ecosystem: HDFS, Hive, Impala, Spark (PySpark preferred)
  • Oozie, Airflow, or similar orchestration tools
  • Proven expertise with PySpark, Spark SQL, and large-scale data processing patterns
  • Experience with Databricks and Delta Lake (or equivalent big-data platforms)
  • Strong programming background in Python, Scala, or Java
  • Experience with cloud services (AWS preferred): S3, Glue, EMR, Redshift, Lambda, Athena, etc.
Job Responsibility:
  • Build, enhance, and maintain large-scale ETL/ELT pipelines using Hadoop ecosystem tools including HDFS, Hive, Impala, and Oozie/Airflow
  • Develop distributed data processing solutions with PySpark, Spark SQL, Scala, or Python to support complex data transformations
  • Implement scalable and secure data ingestion frameworks to support both batch and streaming workloads
  • Work hands-on with Snowflake to design performant data models, optimize queries, and establish solid data governance practices
  • Collaborate on the migration and modernization of current big-data workloads to cloud-native platforms and Databricks
  • Tune Hadoop, Spark, and Snowflake systems for performance, storage efficiency, and reliability
  • Apply best practices in data modelling, partitioning strategies, and job orchestration for large datasets
  • Integrate metadata management, lineage tracking, and governance standards across the platform
  • Build automated validation frameworks to ensure accuracy, completeness, and reliability of data pipelines (see the sketch after this list)
  • Develop unit, integration, and end-to-end testing for ETL workflows using Python, Spark, and dbt testing where applicable
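
As a sketch of the automated validation mentioned above, here are a few basic PySpark checks on a pipeline output; the table and key column are hypothetical, and a real framework would expand the check list and reporting.

    # Illustrative post-load validation step; table and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("pipeline-validation").getOrCreate()

    df = spark.read.table("curated.orders")

    checks = {
        "non_empty": df.count() > 0,
        "no_null_keys": df.where(F.col("order_id").isNull()).count() == 0,
        "no_duplicate_keys": df.count() == df.select("order_id").distinct().count(),
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        # Fail loudly so the orchestrator (Airflow/Oozie) marks the run as failed.
        raise ValueError(f"Data validation failed: {failed}")
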

Senior Software Engineer, Data Platform

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location:
United States, San Mateo
Salary:
180000.00 - 240000.00 USD / Year
Skydio
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (we use Python)
  • Comfort working across the full data stack — from ingestion and transformation to orchestration and visualization
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility:
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones (see the sketch after this list)
  • Build and evolve our Databricks and Palantir Foundry environments to empower every Skydian to query data, define jobs, and build dashboards
  • Develop data systems that make our products truly data-driven — from predictive analytics that anticipate hardware failures, to 3D connectivity mapping, to in-depth flight telemetry analysis
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
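
A hedged sketch of the telemetry ingestion described above, as a Spark Structured Streaming job that assumes a Delta-capable runtime such as Databricks; the landing path, schema, and target table are invented.

    # Illustrative streaming telemetry ingest; paths, schema, and table are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

    telemetry_schema = StructType([
        StructField("vehicle_id", StringType()),
        StructField("recorded_at", TimestampType()),
        StructField("battery_pct", DoubleType()),
        StructField("altitude_m", DoubleType()),
    ])

    stream = (
        spark.readStream
             .schema(telemetry_schema)
             .json("/mnt/landing/telemetry/")            # files dropped by an upstream uploader
             .withColumn("ingest_ts", F.current_timestamp())
    )

    (stream.writeStream
           .format("delta")
           .option("checkpointLocation", "/mnt/checkpoints/telemetry/")
           .outputMode("append")
           .toTable("bronze.telemetry"))
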
What we offer:
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
  • Full-time