Data Engineering Tech Lead – Azure Databricks

Lingaro

Location:
India

Contract Type:
Not provided

Salary:
Not provided

Job Responsibility:

  • Take primary responsibility for efficient and effective project delivery
  • Provide leadership and guidance to the data engineering team
  • Support team members with troubleshooting and resolving complex technical issues
  • Utilize and promote Generative AI tools to accelerate project delivery
  • Provide technical expertise and direction in data engineering
  • Collaborate with stakeholders to understand project requirements
  • Support project managers to ensure that projects are executed effectively
  • Act as a trusted advisor for the customer
  • Oversee the design and architecture of data solutions
  • Align coding standards, conduct code reviews
  • Identify and introduce quality assurance processes for data pipelines
  • Optimize data processing and storage for performance, efficiency and cost savings
  • Evaluate and implement new technologies to improve data engineering processes
  • Act as main point of contact to other teams/contributors engaged in the project
  • Maintain technical documentation of the project
  • Ensure compliance with security standards and regulations

Requirements:

  • A bachelor’s or master’s degree in Computer Science, Information Systems, or a related field is typically required
  • Minimum of 8-10 years of experience in data engineering or a related field
  • Strong technical skills in data engineering, including proficiency in programming languages such as Python, SQL, R or Scala
  • Practical experience with Microsoft Azure cloud and Databricks platform
  • Expertise in working with various data tools and technologies, such as ETL frameworks, data pipelines, and data warehousing solutions
  • Proven experience in leading and managing a team of data engineers
  • In-depth knowledge of data management principles and best practices
  • Strong project management skills
  • Excellent problem-solving and analytical skills
  • Knowledge of data security and privacy regulations
  • Excellent communication and interpersonal skills
  • Continuous learning mindset

Nice to have:

  • Additional certifications in data integration tools or platforms are advantageous
  • Familiarity with other cloud platforms, such as GCP or AWS, is beneficial
  • Hands-on experience using GenAI tools in daily programming is highly beneficial

What we offer:

  • Stable employment
  • 100% remote
  • Flexibility regarding working hours
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Grow as we grow as a company
  • A diverse, inclusive, and values-driven community
  • Autonomy to choose the way you work
  • Create our community together
  • Activities to support your well-being and health
  • Plenty of opportunities to donate to charities and support the environment

Additional Information:

Job Posted:
January 03, 2026

Employment Type:
Full-time

Work Type:
Remote work

Similar Jobs for Data Engineering Tech Lead – Azure Databricks

Principal Data Engineer

Our Principal Data Engineers are responsible for leading and delivering strategi...

Location:
United Kingdom, Bristol; London; Manchester; Swansea

Salary:
100000.00 - 115000.00 GBP / Year

Made Tech

Expiration Date:
Until further notice

Requirements:
  • Understanding of the issues and challenges that the public sector faces in delivering services that make the best use of data and digital capabilities, transforming legacy infrastructure, and taking an innovative and user-centric approach
  • Ability to innovate and take learnings from the commercial sector, other countries and advances in technology and apply them to UK Public Sector challenges to create tangible solutions for our clients
  • Experience building trusted advisor relationships with senior client stakeholders within the public sector
  • Experience of building and leading high performing, consulting teams and creating the leveraged engagements to provide a cost-effective, profitable, successful client-facing delivery
  • Leadership of bids and solution shaping to produce compelling proposals that help Made Tech win new business and grow the industry
  • Experience of managing third-party partnerships and suppliers (in conjunction with Made Tech colleagues) to provide a consolidated and seamless delivery team to clients
  • Experience in delivering complex and difficult engagements that span multiple capabilities for user-facing digital and data services in the public sector
  • Experience in identifying opportunities based on client needs and developing targeted solutions to progress the development of the opportunity
  • Experience of working with sales professionals and commercial responsibility for strategic organisational goals
  • Experience working directly with customers and users within a technology consultancy
Job Responsibility:
  • Collaborate with clients to understand their needs, provide solution advice in your role as a trusted advisor and shape solutions that leverage Made Tech's wider capabilities and credentials
  • Assess project performance as part of the billable delivery team, quality assure (QA) the deliverables and outcomes, and ensure client satisfaction. Coach and mentor team members and provide direction to enable them to achieve their engagement outcomes and to develop their careers
  • Act as a Technical Authority of the Data & AI capability to provide oversight and ensure alignment with internal and industry best practices. Ensure engagement experience is captured and used to improve standards and contribute to Made Tech knowledge
  • Participate in business development activities, including bids and pre-sales within the account, industry and practice. Coach team members on their contributions and oversee the relevant technical aspects of the proposal submission
  • Undertake people management responsibilities, including performance reviews and professional development of your engagement and practice colleagues
  • Serve as a thought leader within Made Tech, our account engagements and the wider public sector and represent the company at industry events
What we offer:
  • 30 days of paid annual leave + bank holidays
  • Flexible Parental Leave
  • Remote Working
  • Paid counselling as well as financial and legal advice
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events

Employment Type:
Full-time

Data Engineering Tech Lead-Azure

Location:
India

Salary:
Not provided

Lingaro

Expiration Date:
Until further notice

Requirements:
  • A bachelor’s or master’s degree in Computer Science, Information Systems, or a related field is typically required
  • Minimum of 8-10 years of experience in data engineering or a related field
  • Strong technical skills in data engineering, including proficiency in programming languages such as Python, SQL, R or Scala
  • Practical experience with Microsoft Azure cloud and Databricks platform
  • Expertise in working with various data tools and technologies, such as ETL frameworks, data pipelines, and data warehousing solutions
  • Proven experience in leading and managing a team of data engineers
  • In-depth knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Strong project management skills
  • Excellent problem-solving and analytical skills
  • Knowledge of data security and privacy regulations
Job Responsibility:
  • Take primary responsibility for efficient and effective project delivery
  • Provide leadership and guidance to the data engineering team, including mentoring, coaching, and fostering a collaborative work environment
  • Set clear goals, assign tasks, and manage resources to ensure successful project delivery
  • Work closely with developers to support them and improve data engineering processes
  • Support team members with troubleshooting and resolving complex technical issues and challenges
  • Utilize and promote Generative AI tools to accelerate project delivery
  • Provide technical expertise and direction in data engineering
  • Collaborate with stakeholders to understand project requirements, define scope, and create project plans
  • Support project managers to ensure that projects are executed effectively, meeting timelines, budgets, and quality standards
  • Act as a trusted advisor for the customer
What we offer:
  • Stable employment
  • 100% remote
  • Flexibility regarding working hours
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Grow as we grow as a company
  • A diverse, inclusive, and values-driven community

Employment Type:
Full-time

Head of Cloud Engineering

You'll own the technical vision and delivery of our core platforms. Build world-...

Location:
United Kingdom, London

Salary:
Not provided

CloserStill Media

Expiration Date:
Until further notice

Requirements:
  • Senior engineering leadership experience (Head of Engineering, Engineering Manager level)
  • Proven track record building data platforms for analytics and AI/ML
  • Deep Microsoft Azure expertise
  • Hands-on experience with Databricks, APIs, and integration platforms
  • Strong understanding of MLOps and productionizing machine learning
  • Experience leading and scaling engineering teams
  • Tech stack: Azure, Databricks, Python, PySpark, SQL, JavaScript, Node, Bash
Job Responsibility:
  • Define and own the technical vision and roadmap for CloserStill's core data, AI, and integration platforms
  • Architect scalable, secure, and resilient cloud-native solutions using Microsoft Azure
  • Lead development of a unified data platform supporting business intelligence, analytics, AI/ML workloads, and real-time operational use cases
  • Design and evolve an API-first integration platform enabling internal teams and external partners to build and integrate services efficiently
  • Lead implementation and optimisation of data pipelines for data engineering, analytics, and machine learning
  • Implement best practices for MLOps, including model deployment, monitoring, versioning, and lifecycle management
  • Partner closely with data science and analytics teams to ensure platforms enable experimentation and production-grade AI
  • Build, lead, and mentor high-performing engineering teams across data platform and integration domains
  • Set engineering standards for code quality, security, reliability, and performance
  • Champion DevOps and CI/CD practices to ensure fast, safe, and repeatable delivery

Employment Type:
Full-time

Data Architect

We are seeking a Principal‑level Data Architect with deep expertise in enterpris...

Location:
Canada

Salary:
180000.00 - 250000.00 CAD / Year

Valtech

Expiration Date:
Until further notice

Requirements:
  • 10+ years of experience in data engineering, data architecture, or platform engineering
  • Experience designing or building enterprise data platforms on at least one of Azure, GCP, or AWS and Databricks
  • Deep expertise in SQL, Python, distributed data processing, and cloud-native data design
  • Significant experience with medallion/lakehouse architecture patterns
  • Strong knowledge of modern data platforms: Databricks, Azure Synapse, Microsoft Fabric, Delta Lake, BigQuery, etc.
  • Proven experience leading architecture across large programs and multiple concurrent projects
  • Experience with enterprise automation and integration using REST APIs
  • Strong communication skills and ability to engage confidently with senior leadership and clients
  • Experience in pre-sales, technical solutioning, or client-facing architecture leadership
Job Responsibility:
  • Design and own complex, enterprise-scale data architectures across MS Fabric, Azure, GCP, AWS, or Databricks serverless or hosted environments
  • Define and enforce architectural standards, patterns, and governance frameworks across ingestion, modeling, lineage, security, and orchestration
  • Shape AI‑enabled architecture approaches, including data foundations for ML, feature engineering, and low-latency operationalization pipelines
  • Act as a principal advisor to client technical leadership, helping shape long-term strategy, roadmaps, and modernization initiatives
  • Lead architectural direction during pre-sales cycles, including solutioning, scoping, estimation, and executive-level presentations
  • Anticipate downstream impacts of architectural decisions; maintain ownership when delivery teams or constraints require deviation from the original design
  • Architect highly available, distributed, fault‑tolerant data pipelines supporting batch and streaming workloads
  • Oversee migration and integration of complex, diverse data sources into Fabric, Azure, GCP, or Databricks platforms
  • Define medallion/lakehouse modeling patterns across Bronze/Silver/Gold zones or cloud equivalents
What we offer:
  • A comprehensive insurance plan, where you can choose the module that best suits your needs—Gold, Silver, or Bronze. The employer may contribute up to 80% of your coverage depending on the selected module. This plan includes short- and long-term disability coverage
  • Dialogue via Sun Life provides virtual healthcare services, allowing you to consult with a healthcare professional for emergencies, prescription renewals, and more. You also have access to the Employee and Family Assistance Program, as well as a complete mental health support program
  • A $500 Personal Spending Account, which can be used for healthcare reimbursements, gym memberships, public transit passes, office supplies, or contributions to your RRSP through Valtech
  • A retirement plan where Valtech will match 100% of your RRSP contributions through a Deferred Profit Sharing Plan (DPSP), up to a maximum of 4%. You can start contributing to your RRSP immediately, and to the DPSP after 3 months. The DPSP vests after 24 months of service
  • Access to a flexible vacation under Valtech's policy to support your work-life balance, with 5 days available during your probation period and a prorated amount calculated for the remainder of the year
  • Personal Technology Reimbursement: $30/month for every employee, offered on day 1
  • We close during the winter holidays and offer flexible scheduling throughout the year, so you can enjoy those sunny Friday afternoons—provided your weekly hours are completed

Employment Type:
Full-time

Director Enterprise Data Architecture

The Director Enterprise Data Architecture is a highly skilled and experienced le...

Location:
United States, Daytona Beach; Sandy Springs

Salary:
Not provided

Brown & Brown UK

Expiration Date:
Until further notice

Requirements:
  • Bachelor’s degree in Computer Science or a related field (or equivalent work experience)
  • 8+ years of experience in data architecture and engineering, including enterprise-level strategy and implementation
  • 12+ years of relevant experience, with 5+ years in progressive leadership roles (preferred)
  • Hands-on experience with Azure Data Factory, Synapse Analytics, Azure Databricks, Snowflake (preferred)
  • Experience defining and implementing modern architectural patterns like Data Mesh, Data Virtualization, Lakehouse, Data Fabric, and Delta Lake
  • Familiarity with SQL, Python, and Java (preferred)
  • Agile/Scrum delivery experience (preferred)
  • Certifications such as Azure Solutions Architect Expert, Azure Data Engineer Associate, or equivalent (preferred)
  • Industry experience in insurance, financial services, or highly regulated environments (preferred)
  • Experience building globally distributed data solutions (preferred)
Job Responsibility:
  • Architect the Future: Evolve the Modern Data Platform strategy, architecture, and roadmap across cloud environments, data domains, and business lines
  • Drive Enterprise Alignment: Partner with the Chief Data Officer, Data Platform, and Data & AI Enablement teams to drive adoption of our Enterprise Data Strategy
  • Enable Governance & Innovation: Establish and maintain policies, standards, and solution patterns that balance agility with compliance and scale
  • Mentor and Inspire: Provide hands-on leadership to Data Solution Engineers, division architects, and engineering teams, ensuring alignment with enterprise goals
  • Evaluate and Evolve: Continuously assess the data tech landscape and evolve our architecture to stay ahead of business needs and innovation curves
  • Champion Value Realization: Drive the development of business cases and influence investment decisions for roadmap priorities
What we offer:
  • Health Benefits: Medical/Rx, Dental, Vision, Life Insurance, Disability Insurance
  • Financial Benefits: ESPP, 401k, Student Loan Assistance, Tuition Reimbursement
  • Mental Health & Wellness: Free Mental Health & Enhanced Advocacy Services
  • Beyond Benefits: Paid Time Off, Holidays, Preferred Partner Discounts and more

Employment Type:
Full-time

Associate Manager

Lead development and ongoing enhancement of PepsiCo’s Marketing Mix Modeling (MM...

Location:
United States, Purchase, New York

Salary:
169541.00 - 189000.00 USD / Year

Pepsico

Expiration Date:
Until further notice

Requirements:
  • Master's degree (US or Foreign Equivalent) in Information Systems, Technical Project or Product Management
  • 4 years of experience in technology, technical project or product management
  • 4 years' experience managing end-to-end data product lifecycle, creating Requirements, and planning and prioritizing features using JIRA and Azure DevOps
  • 3 years' experience working with data engineers on ETL and API integrations using Snowflake and writing SQL to support advanced analytics/modeling
  • 2 years' experience collaborating with Data Science, Engineering, and Governance to write and manage user stories in Azure DevOps
  • Certified and experienced in building interactive dashboards in Power BI or Tableau for tracking metrics
  • Leading Agile Scrum ceremonies and backlog grooming for faster data product delivery
  • Advanced Analytics Platforms (Databricks, Snowflake, SQL) and Knowledge of data architecture, modeling, and engineering concepts
  • Familiarity with Azure tech stack
  • 1 year's experience implementing cleanroom tech (Habu, AWS or InfoSum) for First-Party data collaboration
Job Responsibility:
  • Lead development and ongoing enhancement of PepsiCo’s Marketing Mix Modeling (MMM) data product by defining product roadmaps, conducting discovery sessions, and managing delivery across cross-functional teams
  • Manage the Sales Based Outcomes data product, ensuring it aligns with PepsiCo’s media investment strategy and supports actionable insights for ROI optimization
  • Oversee implementation and advancement of the PepsiCo Media Data Hub (MDH), leveraging cleanroom technologies, Databricks for analytics, Power BI and Tableau for BI, and SQL for real-time data access and media performance analysis
  • Collaborate with stakeholders across Data Engineering, Data Scientists, and Business Insights to gather business requirements, define features, and ensure alignment with enterprise data strategy
  • Apply Agile methodologies to lead end-to-end product lifecycle management, including backlog grooming, sprint planning, and feature release coordination using tools like JIRA, Azure DevOps, and Snowflake
  • Ensure data integrity and interoperability across 40+ internal and external media sources, enabling unified performance tracking and dashboard reporting
  • Translate complex data needs into scalable product solutions that enhance campaign decision-making, measurement accuracy, and media planning efficiency
  • Provide leadership across global, cross-functional teams and foster collaboration to drive innovation and delivery of high-impact data products in a consumer goods context

Employment Type:
Full-time

Director, Data Engineering & Analytics

We are seeking a Director, Data Engineering & Analytics to be a hands-on, indivi...

Location:
United States

Salary:
220000.00 - 235000.00 USD / Year

Apogee Therapeutics

Expiration Date:
Until further notice

Requirements:
  • 10+ years of experience in data engineering, data platform, or analytics roles, with meaningful experience owning or leading a modern cloud data platform (Data Lakehouse or cloud data warehouse)
  • Bachelor’s degree (or equivalent professional experience) in Life Sciences, Information Systems, Engineering, or related field
  • Proven track record as a hands-on senior level individual contributor
  • Experience with cloud-based data platforms (e.g., Databricks, Snowflake, BigQuery, or similar) and modern data stack components
  • Strong expertise in data modeling and ETL/ELT for analytics and reporting
  • Experience working closely with integration teams (ideally with MuleSoft or similar)
  • Prior experience in biotech / pharma / life sciences is strongly preferred, especially supporting Clinical Development, Clinical Operations, Biometrics, or Medical Monitoring
  • Familiarity with data from CROs and clinical systems (e.g., EDC, CTMS, safety, lab, statistical outputs)
  • Experience working in regulated environments (e.g., GxP, 21 CFR Part 11, HIPAA) and supporting audit/inspection readiness is highly desirable
  • Exposure to Apogee-relevant systems is a plus (e.g., Veeva Vault applications, Egnyte GxP repositories, SAS server, NetSuite)
Job Responsibility:
  • Serve as the primary owner of Apogee’s Data Lakehouse architecture, roadmap, and implementation
  • Define the data platform to support Clinical Development and Clinical Operations, with a clear path to extend to Technical Operations, Commercial, Medical Affairs, and Finance
  • Design and oversee data ingestion, transformation, and storage for structured and semi-structured data from key Apogee sources
  • Establish and maintain subject area data models and semantic layers tailored to clinical and operational use cases
  • Ensure the platform is designed and operated in alignment with Apogee’s GxP, 21 CFR Part 11, SOX Controls, and other regulatory/compliance expectations
  • Design, build, and maintain end-to-end data pipelines (ingestion, transformation, curation) into the Data Lakehouse
  • Develop and maintain data models that serve Clinical Development, Clinical Operations, and later Tech Ops and Commercial / Med Affairs / Finance
  • Implement data quality, lineage, and metadata practices
  • Contribute to defining data governance at Apogee
  • Design and build dashboards / analytic views that support key functional area use cases
What we offer:
  • Market competitive compensation and benefits package, including base salary, performance bonus, equity grant opportunities, health, welfare & retirement benefits
  • Competitive time off, including three weeks PTO, two one-week company-wide shutdowns a year and dedicated paid sick leave
  • Commitment to growing you professionally and providing access to resources to further your development
  • Apogee offers regular all-team, in-person meetings to build relationships and problem solve

Employment Type:
Full-time

Tech Analyst, Data & AI

Being part of Air Canada is to become part of an iconic Canadian symbol, recentl...

Location:
Canada, Dorval

Salary:
Not provided

Air Canada

Expiration Date:
Until further notice

Requirements:
  • 3-5 years of experience leading enterprise data warehouse development teams
  • Proven success in Agile environments and cloud-based data platforms, especially Azure and Snowflake
  • Expertise in building robust, scalable data pipelines for batch and streaming data
  • Proficiency in SQL, Python, stored procedures, and scheduling tools
  • Hands-on experience with ETL/ELT tools such as Azure Data Factory (ADF), Databricks, Snowflake, DBT and Talend
  • Skilled in implementing monitoring and alerting mechanisms for data pipelines
  • Strong capability in reviewing engineering deliverables for performance, scalability, and maintainability
  • Experience with prompt engineering and leveraging Generative AI (GenAI) to accelerate development and automate engineering workflows
  • Bachelor’s degree in Engineering, Computer Science, Mathematics, or a related field
  • Excellent communication, problem-solving, and analytical skills
Job Responsibility:
  • Design & Implementation: Manage implementation of solutions for exploratory data analysis and conduct implementation assessments based on Architecture Overview Documents (AOD) and Detailed Data Solutions
  • Provide accurate effort estimations and assess feasibility of proposed architectures and data models
  • Design, undertake and analyze data to determine patterns and insights that can optimize data ingestion and data presentation
  • Pipeline Design & Development: Design and develop cost-effective, scalable ELT pipelines using reusable components and frameworks
  • Support the delivery of AI and data projects by designing and implementing streaming and/or batch pipelines
  • Design and document pipeline implementation strategies, including scheduling of jobs and workflows, dependencies, error handling, monitoring and alerting, and unit testing
  • Monitor and resolve alerts/failures in prod/non-prod environments to ensure operational reliability
  • Participate in peer reviews and oversee the review and approval of code and pipelines
  • Environment Setup & Collaboration: Set up non-production and production environments including MFT flows, RBAC, cloud services, Git repositories, and defining appropriate branching strategies
  • Design, test, and maintain scalable data architectures, including databases and distributed processing systems