Senior Azure Data Engineer

Lingaro

Location: Poland
Category: IT - Software Development
Contract Type: Not provided
Salary: Not provided

Job Description:

Seeking a Senior Azure Data Engineer to act as a senior member of the Data Science & AI Competency Center's AI Engineering team, overseeing the design and delivery of cloud data solutions on Azure and Databricks. The role combines cloud data engineering and automation with hands-on leadership in building and integrating secure, scalable, and cost-efficient data pipelines and analytics for enterprise applications.

Job Responsibilities:

  • Act as a senior member of the Data Science & AI Competency Center, AI Engineering team, guiding delivery and coordinating workstreams
  • Develop and execute a cloud data strategy aligned with organizational goals
  • Lead data integration efforts, including ETL processes, to ensure seamless data flow
  • Implement security measures and compliance standards in cloud environments
  • Continuously monitor and optimize data solutions for cost-efficiency
  • Establish and enforce data governance and quality standards
  • Leverage Azure services, as well as tools like dbt and Databricks, for efficient data pipelines and analytics solutions
  • Work with cross-functional teams to understand requirements and provide data solutions
  • Maintain comprehensive documentation for data architecture and solutions
  • Mentor junior team members in cloud data architecture best practices
  • Design and implement data processing systems on the Azure platform
  • Build data pipelines to ingest data from various sources such as storage systems, databases, APIs, and streaming platforms (see the sketch after this list)
  • Support team members with troubleshooting and resolving complex technical issues and challenges
  • Provide technical guidance in data engineering and work with the team to select appropriate tools, technologies, and methodologies
  • Collaborate with stakeholders to understand project requirements, define scope, and create project plans
  • Support project managers to ensure that projects are executed effectively, meeting timelines and quality standards
  • Act as a trusted advisor for the customer
  • Oversee the design and architecture of data solutions, collaborating with data architects and other stakeholders
  • Ensure data solutions are scalable, efficient, and aligned with business requirements
  • Align coding standards and conduct code reviews to ensure the proper level of code quality
  • Identify and introduce quality assurance processes for data pipelines and workflows
  • Optimize data processing and storage for performance, efficiency, and cost savings
  • Evaluate and implement new technologies to improve data engineering processes across various aspects (CI/CD, quality assurance, coding standards)
  • Maintain the project's technical documentation, keep it valid, and review it regularly
  • Ensure compliance with security standards and regulations
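To make the pipeline work described above more concrete, here is a minimal, hypothetical PySpark sketch of the kind of batch ingestion the role covers: reading raw JSON files landed in Azure Data Lake Storage and writing a Delta table for downstream analytics. The storage account, path, column names, and table name are illustrative assumptions rather than details from this posting, and the sketch assumes a Databricks (or comparable Spark plus Delta Lake) environment.

  # Hypothetical example only: the storage path, columns, and table name are placeholders.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

  # Ingest raw JSON files from an (assumed) ADLS Gen2 container.
  raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/orders/")

  # Basic cleansing: deduplicate on the business key and normalize a timestamp column.
  cleaned = (
      raw.dropDuplicates(["order_id"])
         .withColumn("order_ts", F.to_timestamp("order_ts"))
  )

  # Persist as a Delta table for downstream analytics (assumes Delta Lake is available).
  cleaned.write.format("delta").mode("append").saveAsTable("bronze.orders")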

Requirements:

  • At least 6 years of professional experience in the Data & Analytics area
  • 1+ years of experience in (or acting in) a Senior Consultant or higher role, with a strong focus on data solutions built in Azure and Databricks/Synapse (MS Fabric is nice to have)
  • Proven experience with Azure cloud-based infrastructure, Databricks, and at least one SQL implementation (e.g., Oracle, T-SQL, MySQL)
  • Proficiency in SQL, Python, and PySpark is essential (R or Scala is nice to have)
  • Very good communication skills, including the ability to convey information clearly and precisely to co-workers and business stakeholders
  • Working experience with agile methodologies and supporting tools (JIRA, Azure DevOps)
  • Experience in leading and managing a team of data engineers, providing guidance, mentorship, and technical support
  • Knowledge of data management principles and best practices, including data governance, data quality, and data integration
  • Good project management skills, with the ability to prioritize tasks, manage timelines, and deliver high-quality results within designated deadlines
  • Excellent problem-solving and analytical skills, with the ability to identify and resolve complex data engineering issues
  • Knowledge of data security and privacy regulations, and the ability to ensure compliance within data engineering projects
  • Knowledge of data orchestration tools
  • English at least at B2 level, ideally C1

Nice to have:

  • Experience in designing and creating integration and unit tests (see the sketch after this list)
  • Experience or familiarity with other cloud technologies, data warehouses, data governance, and business analysis
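As a small illustration of the testing item above, the following is a hypothetical pytest sketch for a PySpark transformation. The function, column names, and values are invented for the example and are not taken from the posting; it assumes pyspark and pytest are installed.

  # Hypothetical example only: add_order_flag and its columns are placeholders.
  import pytest
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  @pytest.fixture(scope="session")
  def spark():
      # Local Spark session used only for unit tests.
      return SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()

  def add_order_flag(df):
      # Example transformation under test: flag orders above a threshold.
      return df.withColumn("is_large", F.col("amount") > 100)

  def test_add_order_flag(spark):
      df = spark.createDataFrame([(1, 50), (2, 150)], ["order_id", "amount"])
      result = add_order_flag(df).orderBy("order_id").collect()
      assert [row["is_large"] for row in result] == [False, True]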

What we offer:
  • Stable employment
  • “Office as an option” model
  • Workation
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs
  • Upskilling support
  • Grow as we grow as a company
  • A diverse, inclusive, and values-driven community
  • Autonomy to choose the way you work
  • Create our community together
  • Activities to support your well-being and health
  • Social fund benefits for everyone
  • Plenty of opportunities to donate to charities and support the environment
  • Modern office equipment

Additional Information:

Job Posted: December 09, 2025
Work Type: Remote work

Similar Jobs for Senior Azure Data Engineer

Senior Data Engineer

Our client is a global jewelry manufacturer undergoing a major transformation, m...
Zoolatech
Location: Poland, Wroclaw
Salary: Not provided
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience as a Data Engineer with proven expertise in Azure Synapse Analytics and SQL Server
  • Advanced proficiency in SQL, covering relational databases, data warehousing, dimensional modeling, and cubes
  • Practical experience with Azure Data Factory, Databricks, and PySpark
  • Track record of designing, building, and delivering production-ready data products at enterprise scale
  • Strong analytical skills and ability to translate business requirements into technical solutions
  • Excellent communication skills in English, with the ability to adapt technical details for different audiences
  • Experience working in Agile/Scrum teams
Job Responsibilities:
  • Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform
  • Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models
  • Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies
  • Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance
  • Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions
  • Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits
  • Promote a “build-once, consume-many” approach to maximize reuse and value creation across business verticals
  • Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering
What we offer:
  • Paid Vacation
  • Sick Days
  • Sport/Insurance Compensation
  • English Classes
  • Charity
  • Training Compensation

Senior Data Engineer

Our client is a global jewelry manufacturer undergoing a major transformation, m...
Zoolatech
Location: Turkey, Istanbul
Salary: Not provided
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience as a Data Engineer with proven expertise in Azure Synapse Analytics and SQL Server
  • Advanced proficiency in SQL, covering relational databases, data warehousing, dimensional modeling, and cubes
  • Practical experience with Azure Data Factory, Databricks, and PySpark
  • Track record of designing, building, and delivering production-ready data products at enterprise scale
  • Strong analytical skills and ability to translate business requirements into technical solutions
  • Excellent communication skills in English, with the ability to adapt technical details for different audiences
  • Experience working in Agile/Scrum teams
Job Responsibilities:
  • Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform
  • Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models
  • Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies
  • Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance
  • Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions
  • Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits
  • Promote a “build-once, consume-many” approach to maximize reuse and value creation across business verticals
  • Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering
What we offer:
  • Paid Vacation
  • Hybrid Work (home/office)
  • Sick Days
  • Sport/Insurance Compensation
  • Holidays Day Off
  • English Classes
  • Training Compensation
  • Transportation compensation

Senior Data Engineer

As a Senior Data Engineer, you will be pivotal in designing, building, and optim...
Wpromote
Location: United States
Salary: 102000.00 - 125000.00 USD / Year
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience
  • 4+ years of experience in data engineering or a related field
  • Intermediate to advanced programming skills in Python
  • Proficiency in SQL and experience with relational databases
  • Strong knowledge of database and data warehousing design and management
  • Strong experience with DBT (data build tool) and test-driven development practices
  • Proficiency with at least 1 cloud database (e.g. BigQuery, Snowflake, Redshift, etc.)
  • Excellent problem-solving skills, project management habits, and attention to detail
  • Advanced level Excel and Google Sheets experience
  • Familiarity with data orchestration tools (e.g., Airflow, Dagster, AWS Glue, Azure Data Factory, etc.)
Job Responsibilities:
  • Developing data pipelines leveraging a variety of technologies including dbt and BigQuery
  • Gathering requirements from non-technical stakeholders and building effective solutions
  • Identifying areas of innovation that align with existing company and team objectives
  • Managing multiple pipelines across Wpromote’s client portfolio
What we offer:
  • Half-day Fridays year round
  • Unlimited PTO
  • Extended Holiday break (Winter)
  • Flexible schedules
  • Work from anywhere options*
  • 100% paid parental leave
  • 401(k) matching
  • Medical, Dental, Vision, Life, Pet Insurance
  • Sponsored life insurance
  • Short Term Disability insurance and additional voluntary insurance
  • Fulltime

Senior Data Engineer

We are seeking a highly skilled and motivated Senior Data Engineer/s to architec...
Tech Mahindra
Location: India, Hyderabad
Salary: Not provided
Expiration Date: January 30, 2026

Requirements:
  • 7-10 years of experience in data engineering with a focus on Microsoft Azure and Fabric technologies
  • Strong expertise in: Microsoft Fabric (Lakehouse, Dataflows Gen2, Pipelines, Notebooks)
  • Strong expertise in: Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen2
  • Strong expertise in: Power BI and/or other visualization tools
  • Strong expertise in: Azure Functions, Logic Apps, and orchestration frameworks
  • Strong expertise in: SQL, Python and PySpark/Scala
  • Experience working with structured and semi-structured data (JSON, XML, CSV, Parquet)
  • Proven ability to build metadata-driven architectures and reusable components
  • Strong understanding of data modeling, data governance, and security best practices
Job Responsibilities:
  • Design and implement ETL pipelines using Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Warehouse, SQL) and Azure Data Factory
  • Build and maintain a metadata-driven Lakehouse architecture with threaded datasets to support multiple consumption patterns
  • Develop agent-specific data lakes and an orchestration layer for an uber agent that can query across agents to answer customer questions
  • Enable interactive data consumption via Power BI, Azure OpenAI, and other analytics tools
  • Ensure data quality, lineage, and governance across all ingestion and transformation processes
  • Collaborate with product teams to understand data needs and deliver scalable solutions
  • Optimize performance and cost across storage and compute layers

Senior Data Engineer

As a Senior Data Engineer at Rearc, you'll play a pivotal role in establishing a...
Rearc
Location: United States, New York
Salary: 160000.00 - 200000.00 USD / Year
Expiration Date: Until further notice

Requirements:
  • 8+ years of professional experience in data engineering across modern cloud architectures and diverse data systems
  • Expertise in designing and implementing data warehouses and data lakes across modern cloud environments (e.g., AWS, Azure, or GCP), with experience in technologies such as Redshift, BigQuery, Snowflake, Delta Lake, or Iceberg
  • Strong Python experience for data engineering, including libraries like Pandas, PySpark, NumPy, or Dask
  • Hands-on experience with Spark and Databricks (highly desirable)
  • Experience building and orchestrating data pipelines using Airflow, Databricks, DBT, or AWS Glue
  • Strong SQL skills and experience with both SQL and NoSQL databases (PostgreSQL, DynamoDB, Redshift, Delta Lake, Iceberg)
  • Solid understanding of data architecture principles, data modeling, and best practices for scalable data systems
  • Experience with cloud provider services (AWS, Azure, or GCP) and comfort using command-line interfaces or SDKs as part of development workflows
  • Familiarity with Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, ARM/Bicep, or AWS CDK
  • Excellent communication skills, able to explain technical concepts to technical and non-technical stakeholders
Job Responsibilities:
  • Provide strategic data engineering leadership by shaping the vision, roadmap, and technical direction of data initiatives to align with business goals
  • Architect and build scalable, reliable data solutions, including complex data pipelines and distributed systems, using modern frameworks and technologies (e.g., Spark, Kafka, Kubernetes, Databricks, DBT)
  • Drive innovation by evaluating, proposing, and adopting new tools, patterns, and methodologies that improve data quality, performance, and efficiency
  • Apply deep technical expertise in ETL/ELT design, data modeling, data warehousing, and workflow optimization to ensure robust, high-quality data systems
  • Collaborate across teams—partner with engineering, product, analytics, and customer stakeholders to understand requirements and deliver impactful, scalable solutions
  • Mentor and coach junior engineers, fostering growth, knowledge-sharing, and best practices within the data engineering team
  • Contribute to thought leadership through knowledge-sharing, writing technical articles, speaking at meetups or conferences, or representing the team in industry conversations
What we offer:
  • Health Benefits
  • Generous time away
  • Maternity and Paternity leave
  • Educational resources and reimbursements
  • 401(k) plan with a company contribution
  • Fulltime

Senior Data Engineer

RAC are on the lookout for an experienced Senior Data Engineer to join our Data ...
RAC
Location: United Kingdom, Bristol
Salary: Not provided
Expiration Date: December 30, 2025

Requirements:
  • Great knowledge of technologies including but not limited to: DBT, SQL, Snowflake, Airflow, Azure Data Factory, PowerBI
  • Be able to work with minimal supervision in a dynamic and timeline sensitive work environment
  • A strong understanding of agile data development methodologies, values, and procedures
  • A thorough understanding of best practice in the data engineering lifecycle
  • Understanding of and a passion for automation
  • Strong stakeholder management, communication, organisation, and time management skills
  • Ability to help coach the team to reach their highest potential
  • “Self-motivated”, creative and efficient in proposing solutions to complex, time-critical problems
  • Be able to deal with multiple projects and deadlines
  • Strong analytical and problem-solving skills with a high attention to detail
Job Responsibilities:
  • Develop high quality data products and ensure that solutions built by the data team have resilience in mind
  • Implement the technical strategy to ensure our systems and architecture remain relevant and capable of meeting the demands of the business
  • Provide Knowledge management for the Data estate
  • Keep up to date with emerging technologies, ensuring any new tools adopted in the RAC are fit for purpose and commercially appropriate
  • Engineer in required technologies laying foundations of best practices for the wider data team to follow
  • Take a leadership role within the Data team, providing coaching and mentoring to all levels of seniority
  • Influence and interface with the business, making sense of complicated or incomplete requests
What we offer:
  • Competitive salary plus automatic enrolment in our ‘Owning It Together’ Colleague Share Scheme
  • Free RAC Complete Breakdown Service from day one
  • Access to a car salary sacrifice scheme (including electric vehicle options) after 12 months
  • 25 days annual leave, plus bank holidays
  • Paid family leave, flexible schedules, and practical resources to help navigate personal commitments
  • Pension scheme with up to 6.5% matched contributions
  • Life assurance cover up to 4x salary (10x optional with flex benefits)
  • 24/7 confidential support service
  • Access Orange Savings, our exclusive discount portal with deals across top retailers, holidays, tools, tech and more
  • After passing probation, you’ll automatically join our Colleague Share Scheme
  • Fulltime

Senior Data Engineer

RAC are on the lookout for an experienced Senior Data Engineer to join our Data ...
RAC
Location: United Kingdom, Walsall
Salary: Not provided
Expiration Date: December 30, 2025

Requirements:
  • Great knowledge of technologies including but not limited to: DBT, SQL, Snowflake, Airflow, Azure Data Factory, PowerBI
  • Be able to work with minimal supervision in a dynamic and timeline sensitive work environment
  • A strong understanding of agile data development methodologies, values, and procedures
  • A thorough understanding of best practice in the data engineering lifecycle
  • Understanding of and a passion for automation
  • Strong stakeholder management, communication, organisation, and time management skills
  • Ability to help coach the team to reach their highest potential
  • “Self-motivated”, creative and efficient in proposing solutions to complex, time-critical problems
  • Be able to deal with multiple projects and deadlines
  • Strong analytical and problem-solving skills with a high attention to detail
Job Responsibilities:
  • Develop high quality data products and ensure that solutions built by the data team have resilience in mind
  • Implement the technical strategy to ensure our systems and architecture remain relevant and capable of meeting the demands of the business
  • Provide Knowledge management for the Data estate
  • Keep up to date with emerging technologies, ensuring any new tools adopted in the RAC are fit for purpose and commercially appropriate
  • Engineer in required technologies laying foundations of best practices for the wider data team to follow
  • Take a leadership role within the Data team, providing coaching and mentoring to all levels of seniority
  • Influence and interface with the business, making sense of complicated or incomplete requests
What we offer:
  • Competitive salary
  • Automatic enrolment in ‘Owning It Together’ Colleague Share Scheme
  • Free RAC Complete Breakdown Service from day one
  • Access to a car salary sacrifice scheme (including electric vehicle options) after 12 months
  • 25 days annual leave, plus bank holidays
  • Paid family leave
  • Flexible schedules
  • Practical resources to help navigate personal commitments
  • Pension scheme with up to 6.5% matched contributions
  • Life assurance cover up to 4x salary (10x optional with flex benefits)
  • Fulltime

Senior Data Engineer

PhasorSoft Group
Location: United States, Flowood
Salary: Not provided
Expiration Date: Until further notice

Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibilities:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
  • Fulltime