Senior Analyst - Data Engineer

Puma Energy (pumaenergy.com)

Location:
Mumbai, India

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Collaborate with data scientists and business stakeholders to design, develop, and maintain efficient data pipelines feeding into the organization's data lake. Maintain the integrity and quality of the data lake, enabling accurate and actionable insights for data scientists and informed decision-making for business stakeholders. Utilize extensive knowledge of data engineering and cloud technologies to enhance the organization’s data infrastructure, promoting a culture of data-driven decision-making. Apply data engineering expertise to define and optimize data pipelines using advanced concepts to improve the efficiency and accessibility of data storage. Own the development of an extensive data catalog, ensuring robust data governance and facilitating effective data access and utilization across the organization.

Job Responsibility:

  • Contribute to the development of scalable and performant data pipelines on Databricks, leveraging Delta Lake, Delta Live Tables (DLT), and other core Databricks components
  • Develop data lakes/warehouses designed for optimized storage, querying, and real-time updates using Delta Lake
  • Implement effective data ingestion strategies from various sources (streaming, batch, API-based), ensuring seamless integration with Databricks
  • Ensure the integrity, security, quality, and governance of data across our Databricks-centric platforms
  • Collaborate with stakeholders (data scientists, analysts, product teams) to translate business requirements into Databricks-native data solutions
  • Build and maintain ETL/ELT processes, heavily utilizing Databricks, Spark (Scala or Python), SQL, and Delta Lake for transformations
  • Monitor and optimize the cost-efficiency of data operations on Databricks, ensuring optimal resource utilization
  • Utilize a range of Databricks tools, including the Databricks CLI and REST API, alongside Apache Spark™, to develop, manage, and optimize data engineering solutions
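
The responsibilities above centre on Databricks, Spark, and Delta Lake. As a rough illustration only, the sketch below shows a batch ingestion step that lands raw JSON into a Delta table; the storage path, table name, and columns are hypothetical placeholders, not part of this posting.

    # Illustrative only: batch-ingest raw JSON landed in cloud storage into a Delta table.
    # Paths, table names, and columns are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("bronze_ingest").getOrCreate()

    raw = (
        spark.read.format("json")
        .load("abfss://landing@examplestorage.dfs.core.windows.net/sales/")  # hypothetical path
    )

    bronze = (
        raw.withColumn("ingested_at", F.current_timestamp())
           .withColumn("source_file", F.input_file_name())
    )

    (
        bronze.write.format("delta")
        .mode("append")
        .saveAsTable("bronze.sales_raw")  # hypothetical schema.table
    )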

Requirements:

  • 5 years of overall experience and at least 3 years of relevant experience
  • 3 years of experience working with Databricks and Azure (or another cloud platform)
  • Proficiency in Spark, Delta Lake, Structured Streaming, and other Azure Databricks functionalities for sophisticated data pipeline construction
  • Strong capability in diagnosing and optimizing Spark applications and Databricks workloads, including strategic cluster sizing and configuration
  • Expertise in sharing data solutions that leverage Azure Databricks ecosystem technologies for enhanced data management and processing efficiency
  • Profound knowledge of data governance and data security, coupled with an understanding of large-scale distributed systems and cloud architecture design
  • Experience with a variety of data sources and BI tools
  • Experience with CI/CD and DevOps practices specifically tailored for the Databricks environment
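
The requirements above call out Structured Streaming alongside Delta Lake. A minimal, illustrative sketch of a streaming ingest follows; the schema, paths, and target table are hypothetical and assume Spark 3.3+ on Databricks.

    # Illustrative only: stream JSON events from cloud storage into a Delta table.
    # All paths, the schema, and the target table are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

    spark = SparkSession.builder.appName("events_stream").getOrCreate()

    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("event_time", TimestampType()),
        StructField("amount", DoubleType()),
    ])

    events = (
        spark.readStream.format("json")
        .schema(event_schema)
        .load("abfss://landing@examplestorage.dfs.core.windows.net/events/")  # hypothetical
    )

    query = (
        events.writeStream.format("delta")
        .option("checkpointLocation", "abfss://checkpoints@examplestorage.dfs.core.windows.net/events/")
        .outputMode("append")
        .trigger(availableNow=True)  # process available data, then stop
        .toTable("bronze.events")    # hypothetical target table
    )
    query.awaitTermination()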

Additional Information:

Job Posted:
January 09, 2026

Employment Type:
Full-time

Similar Jobs for Senior Analyst - Data Engineer

Senior Data Engineer

At Ingka Investments (Part of Ingka Group – the largest owner and operator of IK...
Location: Leiden, Netherlands
Salary: Not provided
IKEA (https://www.ikea.com)
Expiration Date: Until further notice

Requirements:
  • Formal qualifications (BSc, MSc, PhD) in computer science, software engineering, informatics or equivalent
  • Minimum 3 years of professional experience as a (Junior) Data Engineer
  • Strong knowledge in designing efficient, robust and automated data pipelines, ETL workflows, data warehousing and Big Data processing
  • Hands-on experience with Azure data services like Azure Databricks, Unity Catalog, Azure Data Lake Storage, Azure Data Factory, DBT and Power BI
  • Hands-on experience with data modeling for BI & ML for performance and efficiency
  • The ability to apply such methods to solve business problems using one or more Azure Data and Analytics services in combination with building data pipelines, data streams, and system integration
  • Experience in driving new data engineering developments (e.g. applying new cutting-edge data engineering methods to improve the performance of data integration, using new tools to improve data quality, etc.)
  • Knowledge of DevOps practices and tools including CI/CD pipelines and version control systems (e.g., Git)
  • Proficiency in programming languages such as Python, SQL, PySpark and others relevant to data engineering
  • Hands-on experience deploying code artifacts into production

Job Responsibility:
  • Contribute to the development of the D&A platform and analytical tools, ensuring easy and standardized access to and sharing of data
  • Subject matter expert for Azure Databricks, Azure Data Factory, and ADLS
  • Help design, build and maintain data pipelines (accelerators)
  • Document and make the relevant know-how and standards available
  • Ensure pipelines are consistent with relevant digital frameworks, principles, guidelines and standards
  • Support in understanding the needs of Data Product Teams and other stakeholders
  • Explore ways to create better visibility of data quality and data assets on the D&A platform
  • Identify opportunities for data assets and the D&A platform toolchain
  • Work closely together with partners, peers and other relevant roles like data engineers, analysts or architects across IKEA as well as in your team
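
This role's stack (Azure Databricks, Unity Catalog, ADLS) lends itself to a short illustration. The sketch below, with hypothetical storage and catalog names, reads curated data from ADLS and publishes an aggregated Delta table under a three-level Unity Catalog namespace; it is an assumption-laden example, not part of the posting.

    # Illustrative only: read curated Parquet from ADLS and publish a Unity Catalog table.
    # Storage account, container, catalog, schema, and table names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    orders = spark.read.parquet(
        "abfss://curated@exampleadls.dfs.core.windows.net/orders/"  # hypothetical ADLS path
    )

    daily = (
        orders.groupBy(F.to_date("order_ts").alias("order_date"))
              .agg(F.sum("amount").alias("total_amount"),
                   F.countDistinct("order_id").alias("order_count"))
    )

    # Three-level Unity Catalog namespace: catalog.schema.table (hypothetical names)
    daily.write.format("delta").mode("overwrite").saveAsTable("analytics.sales.daily_orders")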

What we offer:
  • Opportunity to develop on a cutting-edge Data & Analytics platform
  • Opportunities to have a global impact on your work
  • A team of great colleagues to learn together with
  • An environment focused on driving business and personal growth together, with focus on continuous learning
  • Full-time

Senior Data Engineer

As a Senior Software Engineer, you will play a key role in designing and buildin...
Location: United States
Salary: 156000.00 - 195000.00 USD / Year
Apollo.io
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience in platform engineering, data engineering, or a data-facing role
  • Experience in building data applications
  • Deep knowledge of the data ecosystem with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical / Computer Science, Engineering or Mathematics / Statistics)
  • Excellent communication skills
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open

Job Responsibility:
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
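
The responsibilities above call out FastAPI for exposing data services. Below is a minimal, illustrative FastAPI sketch; the endpoint, metric names, and in-memory store are hypothetical stand-ins for a real warehouse-backed service.

    # Illustrative only: a small FastAPI service exposing a data endpoint.
    # The in-memory "METRICS" dict stands in for a real warehouse or feature store.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="data-service-sketch")

    # Hypothetical pre-computed metrics; a real service would query a database.
    METRICS = {
        "daily_active_users": 1234,
        "pipeline_rows_processed": 987654,
    }

    class Metric(BaseModel):
        name: str
        value: float

    @app.get("/metrics/{name}", response_model=Metric)
    def read_metric(name: str) -> Metric:
        if name not in METRICS:
            raise HTTPException(status_code=404, detail="unknown metric")
        return Metric(name=name, value=METRICS[name])

    # Run locally (assumes uvicorn is installed and this file is named data_service.py):
    #   uvicorn data_service:app --reload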

What we offer:
  • Equity
  • Company bonus or sales commissions/bonuses
  • 401(k) plan
  • At least 10 paid holidays per year
  • Flex PTO
  • Parental leave
  • Employee assistance program and wellbeing benefits
  • Global travel coverage
  • Life/AD&D/STD/LTD insurance
  • FSA/HSA and medical, dental, and vision benefits
  • Full-time

Senior Data Engineer

Fospha is dedicated to building the world's most powerful measurement solution f...
Location: Mumbai, India
Salary: Not provided
Blenheim Chalcot (blenheimchalcot.com)
Expiration Date: Until further notice

Requirements:
  • Excellent knowledge of PostgreSQL and SQL technologies
  • Fluent in Python
  • Understanding of data architecture, pipelines, and ELT flows/technologies/methodologies
  • Understanding of agile methodologies and practices
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field

Job Responsibility:
  • Implement and maintain ELT (Extract, Load, Transform) processes using scalable data pipelines and data architecture
  • Collaborate with cross-functional teams to understand data requirements and deliver effective solutions
  • Ensure data integrity and quality across various data sources
  • Support data-driven decision-making by providing clean, reliable, and timely data
  • Define the standards for high-quality data for Data Science and Analytics use-cases and help shape the data roadmap for the domain
  • Design, develop, and maintain the data models used by ML Engineers, Data Analysts and Data Scientists to access data
  • Conduct exploratory data analysis to uncover data patterns and trends
  • Identify opportunities for process improvement and drive continuous improvement in data operations
  • Stay updated on industry trends, technologies, and best practices in data engineering
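
Given the Python-and-PostgreSQL stack named above, the following sketch illustrates a tiny ELT step with psycopg2: raw rows are loaded first, then transformed with SQL inside the database. The DSN, schemas, and tables are hypothetical and assume the target tables (with a unique key on order_date) already exist.

    # Illustrative only: a tiny ELT step with psycopg2 -- load raw rows, then transform in SQL.
    # The DSN, tables, and columns are hypothetical placeholders.
    import psycopg2

    RAW_ROWS = [
        ("2024-01-01", "acme", 120.0),
        ("2024-01-01", "globex", 80.5),
    ]

    with psycopg2.connect("dbname=analytics user=etl password=secret host=localhost") as conn:
        with conn.cursor() as cur:
            # Load: land raw data as-is
            cur.executemany(
                "INSERT INTO raw.orders (order_date, customer, amount) VALUES (%s, %s, %s)",
                RAW_ROWS,
            )
            # Transform: build an analytics-ready table inside the database
            cur.execute("""
                INSERT INTO analytics.daily_revenue (order_date, revenue)
                SELECT order_date, SUM(amount)
                FROM raw.orders
                GROUP BY order_date
                ON CONFLICT (order_date) DO UPDATE SET revenue = EXCLUDED.revenue
            """)
        conn.commit()  # explicit commit; the connection context manager also commits on clean exit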

What we offer:
  • Competitive salary
  • Be part of a leading global venture builder, Blenheim Chalcot, and learn from the incredible talent in BC
  • Be exposed to the right mix of challenges and learning and development opportunities
  • Flexible Benefits including Private Medical and Dental, Gym Subsidies, Life Assurance, Pension scheme, etc.
  • 25 days of paid holiday + your birthday off
  • Free snacks in the office
  • Quarterly team socials
  • Full-time

Senior Data Engineer

At Blue Margin, we are on a mission to build the go-to data platform for PE-back...
Location: Fort Collins, United States
Salary: 110000.00 - 140000.00 USD / Year
Blue Margin (bluemargin.com)
Expiration Date: Until further notice

Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or related field
  • 5+ years of professional experience in data engineering, with emphasis on Python & PySpark/Apache Spark
  • Proven ability to manage large datasets and optimize for speed, scalability, and reliability
  • Strong SQL skills and understanding of relational and distributed data systems
  • Experience with Azure Data Factory, Synapse Pipelines, Fivetran, Delta Lake, Microsoft Fabric, or Snowflake
  • Knowledge of data modeling, orchestration, and Delta/Parquet file management best practices
  • Familiarity with CI/CD, version control, and DevOps practices for data pipelines
  • Experience leveraging AI-assisted tools to accelerate engineering workflows
  • Strong communication skills; ability to convey complex technical details to both engineers and business stakeholders

Job Responsibility:
  • Architect, design, and optimize large-scale data pipelines using tools like PySpark, SparkSQL, Delta Lake, and cloud-native tools
  • Drive efficiency in incremental/delta data loading, partitioning, and performance tuning
  • Lead implementations across Azure Synapse, Microsoft Fabric, and/or Snowflake environments
  • Collaborate with stakeholders and analysts to translate business needs into scalable data solutions
  • Evaluate and incorporate AI/automation to improve development speed, testing, and data quality
  • Oversee and mentor junior data engineers, establishing coding standards and best practices
  • Ensure high standards for data quality, security, and governance
  • Participate in solution design for client engagements, balancing technical depth with practical outcomes
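
Incremental/delta loading with PySpark and Delta Lake, as described above, is commonly done with a MERGE. The sketch below is illustrative only; the paths, key column, and availability of the delta-spark package are assumptions.

    # Illustrative only: incremental upsert of change records into a Delta table via MERGE.
    # Paths and the key column are hypothetical; assumes delta-spark (or Databricks) is available.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    updates = spark.read.format("delta").load("/mnt/staging/customers_changes")  # hypothetical

    target = DeltaTable.forPath(spark, "/mnt/gold/customers")  # hypothetical

    (
        target.alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )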

What we offer:
  • Competitive pay
  • Strong benefits
  • Flexible hybrid work setup
  • Full-time

Senior Data Engineer

As a Senior Data Engineer at Corporate Tools, you will work closely with our Sof...
Location: United States
Salary: 150000.00 USD / Year
Corporate Tools (corporatetools.com)
Expiration Date: Until further notice

Requirements:
  • Bachelor’s (BA or BS) in computer science, or related field
  • 2+ years in a full stack development role
  • 4+ years of experience working in a data engineer role, or related position
  • 2+ years of experience standing up and maintaining a Redshift warehouse
  • 4+ years of experience with Postgres, specifically with RDS
  • 4+ years of AWS experience, specifically S3, Glue, IAM, EC2, DDB, and other related data solutions
  • Experience working with Redshift, DBT, Snowflake, Apache Airflow, Azure Data Warehouse, or other industry standard big data or ETL related technologies
  • Experience working with both analytical and transactional databases
  • Advanced working SQL (Preferably PostgreSQL) knowledge and experience working with relational databases
  • Experience with Grafana or other monitoring/charting systems

Job Responsibility:
  • Focus on data infrastructure. Lead and build out data services/platforms from scratch (using OpenSource tech)
  • Creating and maintaining transparent, bulletproof ETL (extract, transform, and load) pipelines that clean, transform, and aggregate unorganized and messy data into databases or data sources
  • Consume data from roughly 40 different sources
  • Collaborate closely with our Data Analysts to get them the data they need
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc
  • Improve existing data models while implementing new business capabilities and integration points
  • Creating proactive monitoring so we learn about data breakages or inconsistencies right away
  • Maintaining internal documentation of how the data is housed and transformed
  • Improve existing data models, and design new ones to meet the needs of data consumers across Corporate Tools
  • Stay current with latest cloud technologies, patterns, and methodologies
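
As a rough illustration of the Postgres-to-AWS flow implied above, the sketch below extracts a table with pandas, stages it as Parquet, and uploads it to S3 with boto3 (for example, ahead of a Redshift COPY). The DSN, bucket, and table names are hypothetical, and pandas, pyarrow, SQLAlchemy, and boto3 are assumed to be installed.

    # Illustrative only: extract a Postgres table, write it as Parquet, and stage it in S3
    # for downstream loading (e.g., Redshift COPY). Connection details, bucket, and table
    # names are hypothetical placeholders.
    import boto3
    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql+psycopg2://etl:secret@localhost:5432/appdb")  # hypothetical DSN

    # Extract
    df = pd.read_sql("SELECT id, created_at, status, total FROM orders", engine)

    # Light transform before staging
    df["created_date"] = pd.to_datetime(df["created_at"]).dt.date

    # Stage as Parquet (assumes pyarrow is installed)
    local_path = "/tmp/orders.parquet"
    df.to_parquet(local_path, index=False)

    # Load to S3 (hypothetical bucket and key)
    s3 = boto3.client("s3")
    s3.upload_file(local_path, "example-data-lake", "staging/orders/orders.parquet")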

What we offer:
  • 100% employer-paid medical, dental and vision for employees
  • Annual review with raise option
  • 22 days Paid Time Off accrued annually, and 4 holidays
  • After 3 years, PTO increases to 29 days. Employees transition to flexible time off after 5 years with the company—not accrued, not capped, take time off when you want
  • The 4 holidays are: New Year’s Day, Fourth of July, Thanksgiving, and Christmas Day
  • Paid Parental Leave
  • Up to 6% company matching 401(k) with no vesting period
  • Quarterly allowance
  • Use to make your remote work set up more comfortable, for continuing education classes, a plant for your desk, coffee for your coworker, a massage for yourself... really, whatever
  • Open concept office with friendly coworkers
  • Full-time

Senior Data Analyst

We are currently looking for a Senior Data Analyst to join a fast-growing, inno...
Location: United Kingdom
Salary: 65000.00 - 75000.00 GBP / Year
Data Idols (dataidols.com)
Expiration Date: Until further notice

Requirements:
  • 2+ years’ experience as a Data Analyst, Data Engineer, or Analytics Engineer
  • dbt
  • Advanced SQL skills and experience with data visualisation tools (Tableau preferred)
  • Knowledge of data modelling, warehousing, and analytics best practices
  • Strong communication skills with the ability to explain technical findings clearly

Job Responsibility:
  • Design, build, and maintain data models and pipelines
  • Create engaging dashboards and visualisations to present findings to non-technical audiences
  • Collaborate with stakeholders to translate business needs into data-driven outcomes
  • Use analytics to uncover trends, opportunities, and risks that shape company strategy
  • Champion data best practices and innovation within the wider team

What we offer:
  • Remote working
  • L&D budget
  • Bonus
  • £2,500 personal development budget for certifications, training, and learning
  • Health insurance (where applicable)
  • Full-time

Senior Data Engineer

This project is designed for consulting companies that provide analytics and pre...
Location: Not provided
Salary: Not provided
Lightpoint Global (lightpointglobal.com)
Expiration Date: Until further notice

Requirements:
  • Successfully implemented and released data integration services or APIs using modern Python frameworks in the past 4 years
  • Successfully designed data models and schemas for analytics or data warehousing solutions
  • Strong analysis and problem-solving skills
  • Strong knowledge of the Python programming language and data engineering
  • Deep understanding of good programming practices, design patterns, and software architecture principles
  • Ability to work as part of a team by contributing to product backlog reviews and solution design and implementation
  • Discipline in implementing software in a timely manner while ensuring product quality isn't compromised
  • Formal training in software engineering, computer science, computer engineering, or data engineering
  • Working knowledge of Apache Airflow or a similar technology for workflow orchestration
  • Working knowledge of dbt (data build tool) for analytics transformation workflows

Job Responsibility:
  • Work in an agile team to design, develop, and implement data integration services that connect diverse data sources including event tracking platforms (GA4, Segment), databases, APIs, and third-party systems
  • Build and maintain robust data pipelines using Apache Airflow, dbt, and Spark to orchestrate complex workflows and transform raw data into analytics-ready datasets in Snowflake
  • Develop Python-based integration services and APIs that enable seamless data flow between various data technologies and downstream applications
  • Collaborate actively with data analysts, analytics engineers, and platform teams to understand requirements, troubleshoot data issues, and optimize pipeline performance
  • Participate in code reviews, sprint planning, and retrospectives to ensure high-quality, production-ready code by the end of each sprint
  • Contribute to the continuous improvement of data platform infrastructure, development practices, and deployment processes in accordance with CI/CD best practices
  • Full-time
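
The stack above pairs Apache Airflow with dbt for orchestration and transformation. The sketch below is an illustrative Airflow DAG (assuming Airflow 2.4+) that runs a Python extract step and then invokes dbt run; the DAG id, schedule, and dbt project path are hypothetical.

    # Illustrative only: a small Airflow DAG that extracts data with Python and then runs dbt.
    # DAG id, schedule, and the dbt project path are hypothetical placeholders (Airflow 2.4+ assumed).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_events():
        # Placeholder for pulling data from a source API into the warehouse staging area.
        print("extracting events...")


    with DAG(
        dag_id="events_elt_sketch",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_events", python_callable=extract_events)

        transform = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/analytics_project && dbt run",  # hypothetical project path
        )

        extract >> transform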

Senior Data Engineering Architect

Location: Poland
Salary: Not provided
Lingaro (lingarogroup.com)
Expiration Date: Until further notice

Requirements:
  • Proven work experience as a Data Engineering Architect or in a similar role, and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations

Job Responsibility:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
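
Since the responsibilities above include defining data schemas and building ETL processing frameworks (with Python/PySpark named in the requirements), here is an illustrative sketch of one such building block: an explicit schema plus a small, reusable transform function. All paths and column names are hypothetical.

    # Illustrative only: enforce an explicit schema on raw input and apply a simple,
    # testable transform step -- the kind of building block an ETL framework standardizes.
    # Paths and column names are hypothetical placeholders.
    from pyspark.sql import DataFrame, SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DateType, DecimalType

    spark = SparkSession.builder.getOrCreate()

    invoice_schema = StructType([
        StructField("invoice_id", StringType(), nullable=False),
        StructField("invoice_date", DateType()),
        StructField("country", StringType()),
        StructField("net_amount", DecimalType(18, 2)),
    ])


    def clean_invoices(df: DataFrame) -> DataFrame:
        """Drop records without a key and normalize the country code."""
        return (
            df.filter(F.col("invoice_id").isNotNull())
              .withColumn("country", F.upper(F.trim("country")))
        )


    raw = spark.read.schema(invoice_schema).csv("/data/raw/invoices/", header=True)
    clean_invoices(raw).write.mode("overwrite").parquet("/data/curated/invoices/")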

What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions