
Databricks Engineer Jobs (Remote work)

4 Job Offers

Data Engineering Tech Lead – Azure Databricks
Lead our data engineering initiatives as a Tech Lead specializing in Azure Databricks. This 100% remote role in India requires 8-10 years of experience, expertise in Python/SQL, and Azure cloud. You will guide a team, design data solutions, and leverage GenAI tools while enjoying stable employment...
Location: India
Salary: Not provided
Company: Lingaro
Expiration Date: Until further notice
Junior Data Engineer (Databricks)
Seeking a Junior Data Engineer with 1+ year of commercial experience and hands-on knowledge of Databricks. This role, based in Poland, offers a hybrid work model (2 days remote). Join our team to build and optimize data pipelines in a modern cloud environment.
Location: Poland
Salary: Not provided
Company: Addepto sp. z o.o.
Expiration Date: Until further notice
Senior Azure Data Engineer with Databricks
Seeking a Senior Azure Data Engineer with Databricks expertise in Poland. You will design at-scale data infrastructure, build processing patterns, and develop automated pipelines using Azure, Databricks, Spark, and Python. This role requires strong SQL, CI/CD, and data modeling skills. We offer p...
Location: Poland
Salary: Not provided
Company: DCG Sp. z o. o.
Expiration Date: Until further notice
Databricks Platform Engineer
Seeking a Databricks Platform Engineer in Iselin, US. You will design solutions on the Lakehouse platform using Databricks, PySpark, and Python. Key duties include integrating with enterprise tools, ensuring data quality, and supporting production. Requires expertise in cloud infrastructure, SQL...
Location: Iselin, United States
Salary: Not provided
Company: Signify Technology
Expiration Date: Until further notice
Discover and apply for Databricks Engineer jobs, a pivotal role at the heart of modern data-driven enterprises. A Databricks Engineer is a specialized data professional responsible for designing, building, and maintaining scalable, reliable data and AI platforms using the Databricks Lakehouse ecosystem. This profession sits at the intersection of data engineering, cloud architecture, and data science enablement, focusing on creating the foundational infrastructure that transforms raw data into trusted, analytics-ready assets. Professionals in these roles are the architects of the data backbone, enabling everything from business intelligence and advanced analytics to machine learning and artificial intelligence.

Typically, a Databricks Engineer's core responsibility is to develop and orchestrate end-to-end data pipelines. This involves implementing robust ELT (Extract, Load, Transform) or ETL processes using Apache Spark and Delta Lake, often following architectural patterns like the Medallion Architecture (Bronze, Silver, Gold layers) to systematically refine data quality. They build workflows that ingest data from diverse sources—such as enterprise applications, databases, and streaming services—into a centralized lakehouse.

A significant part of the role is ensuring operational excellence: engineers implement data quality frameworks, monitoring, and alerting to guarantee pipeline reliability and performance. They also enforce data governance and security policies using tools like Unity Catalog, managing access controls, data lineage, and compliance with regulations. Furthermore, these engineers collaborate closely with data scientists and analysts by provisioning curated datasets and feature stores, and they often support the MLOps lifecycle through platforms like MLflow.
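The Medallion refinement described above can be sketched in a few lines. This is a framework-free illustration in plain Python: a real Databricks pipeline would operate on Spark DataFrames persisted as Delta tables, and the record shapes, table contents, and cleaning rules here are invented for the example.

```python
# Illustrative Bronze -> Silver -> Gold refinement (Medallion pattern).
# In Databricks these stages would read/write Delta tables via Spark;
# the schemas and data-quality rules below are made-up examples.

def bronze_ingest(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [{"source": "orders_api", **row} for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: reject malformed records and normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # data-quality rule: drop incomplete records
        silver.append({
            "order_id": int(row["order_id"]),
            "country": str(row.get("country", "unknown")).upper(),
            "amount": float(row["amount"]),
        })
    return silver

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate, ready for BI dashboards."""
    totals = {}
    for row in silver_rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + row["amount"]
    return totals

raw = [
    {"order_id": "1", "country": "pl", "amount": "10.5"},
    {"order_id": None, "country": "us", "amount": "3.0"},  # rejected in Silver
    {"order_id": "2", "country": "pl", "amount": "4.5"},
]
print(gold_aggregate(silver_clean(bronze_ingest(raw))))  # -> {'PL': 15.0}
```

Each layer only ever reads from the one before it, which is what lets teams reprocess Silver and Gold from raw Bronze data when cleaning rules change.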
They are tasked with optimizing cloud storage and compute for both cost-efficiency and high performance, requiring a deep understanding of cloud services like AWS S3, Azure Data Lake Storage, or Google Cloud Storage. Beyond technical execution, the role frequently includes creating documentation, defining best practices, and enabling other teams to use the platform effectively.

The typical skill set for Databricks Engineer jobs is comprehensive. Proficiency in Databricks, Delta Lake, and Apache Spark is fundamental. Strong programming skills in Python, SQL, and/or Scala are essential for writing data transformations and automation scripts. A solid grasp of cloud infrastructure (AWS, Azure, or GCP) and Infrastructure as Code tools like Terraform is highly valued. These roles demand experience in data modeling, pipeline orchestration, and a thorough understanding of data governance principles. Successful candidates usually possess strong problem-solving abilities, a collaborative mindset, and the capacity to communicate complex technical concepts to diverse stakeholders. As organizations increasingly rely on unified data platforms, Databricks Engineer jobs offer a dynamic career path for those passionate about building the next generation of data infrastructure.
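As a small illustration of the SQL transformation skills mentioned above, here is a sketch using Python's built-in sqlite3 module in place of a Databricks SQL warehouse; the events table and its contents are invented for the example.

```python
import sqlite3

# Sketch of a typical data-engineering SQL transformation, using the
# stdlib sqlite3 module as a stand-in for a Databricks SQL warehouse.
# The events schema and rows below are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "click", "2024-01-01"), (1, "buy", "2024-01-02"),
     (2, "click", "2024-01-01"), (2, "click", "2024-01-03")],
)

# Curated per-user summary: the kind of analytics-ready dataset an
# analyst or data scientist would consume downstream.
rows = conn.execute(
    """SELECT user_id,
              COUNT(*) AS n_events,
              SUM(CASE WHEN event = 'buy' THEN 1 ELSE 0 END) AS n_buys
       FROM events
       GROUP BY user_id
       ORDER BY user_id"""
).fetchall()
print(rows)  # -> [(1, 2, 1), (2, 2, 0)]
```

The same aggregate-and-curate query shape carries over directly to Spark SQL on a lakehouse; only the engine and table storage differ.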
