Data Engineer 2

JLL

Location:
India, Bengaluru

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a Data Engineer P2 who is a self-starter to work in a diverse and fast-paced environment as part of our Enterprise Data team. This individual contributor role is responsible for designing and developing data solutions that are strategic to the business and built on the latest technologies and patterns. This is a global role that requires partnering with the broader JLLT team at the country, regional, and global levels by utilizing in-depth knowledge of data, infrastructure, technologies, and data engineering experience.

Job Responsibility:

  • Design, develop, and maintain scalable and efficient cloud-based data infrastructure using SQL and PySpark
  • Collaborate with cross-functional teams to understand data requirements, identify potential data sources, and define data ingestion architecture
  • Design and implement efficient data pipeline frameworks, ensuring the smooth flow of data from various sources to data lakes, data warehouses, and analytical platforms
  • Troubleshoot and resolve issues related to data processing, data quality, and data pipeline performance
  • Stay updated with emerging technologies, tools, and best practices in cloud data engineering, SQL, and PySpark
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet their needs
  • Document data infrastructure, data pipelines, and ETL processes, ensuring knowledge transfer and smooth handovers
  • Create complex automated tests and integrate them into testing frameworks

Requirements:

  • Bachelor's degree in Computer Science, Data Engineering, or a related field (Master's degree preferred)
  • Minimum 3-5 years of experience in data engineering or full-stack development, with a focus on cloud-based environments
  • Strong expertise in managing big data technologies (Python, SQL, PySpark, Spark) with a proven track record of working on large-scale data projects
  • Strong Databricks experience
  • Strong database/backend testing skills, with the ability to write complex SQL queries for data validation and integrity checks
  • Strong skills in validating streaming and real-time APIs/services, including test automation
  • Experience testing web services (WSDL) and microservices (REST) using custom scripts and assertions for data validation and data-driven testing
  • Experience with cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP)
  • Proficiency in object-oriented programming and software design patterns
  • Experience working in a DevOps model, including installing, configuring, and integrating automation scripts on continuous integration (CI/CD) tools and GitHub for real-time test suite execution and troubleshooting
  • Experience with Unit, Functional, Integration, User Acceptance, System, and Security testing of data pipelines
  • Strong experience in designing and implementing data pipelines, ETL processes, and workflow automation
  • Familiarity with data warehousing concepts, dimensional modeling, data governance best practices, and cloud-based data warehousing platforms (e.g., AWS Redshift, Google BigQuery, Snowflake)
  • Familiarity with cutting-edge AI technologies and demonstrated ability to rapidly learn and adapt to emerging concepts and frameworks
  • Strong problem-solving skills and ability to analyze complex data processing issues
  • Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams
  • Attention to detail and commitment to delivering high-quality, reliable data solutions
  • Ability to adapt to evolving technologies and work effectively in a fast-paced, dynamic environment

Additional Information:

Job Posted:
February 20, 2026

Employment Type:
Full-time
Work Type:
On-site work

Similar Jobs for Data Engineer 2

Data Engineer

As a Data Engineer at Rearc, you'll contribute to the technical excellence of ou...
Location:
India, Bengaluru
Salary:
Not provided
Rearc
Expiration Date:
Until further notice
Requirements:
  • 2+ years of experience in data engineering, data architecture, or related fields
  • Solid track record of contributing to complex data engineering projects
  • Hands-on experience with ETL processes, data warehousing, and data modelling tools
  • Good understanding of data integration tools and best practices
  • Familiarity with cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery)
  • Strong analytical skills
  • Proficiency in implementing and optimizing data pipelines using modern tools and frameworks
  • Strong communication and interpersonal skills
Job Responsibility:
  • Collaborate with Colleagues to understand customers' data requirements and challenges
  • Apply DataOps Principles to create scalable and efficient data pipelines and architectures
  • Support Data Engineering Projects
  • Promote Knowledge Sharing through technical blogs and articles

Data Engineer

Become a player in our data engineering team, grow on a personal level and help ...
Location:
Serbia, Novi Beograd
Salary:
Not provided
MDPI
Expiration Date:
Until further notice
Requirements:
  • A university degree, ideally in Computer Science or related science, technology or engineering field
  • 2+ years of relevant work experience in data engineering roles
  • Experience in data acquisition, data lakes, warehousing, modeling, and orchestration
  • Proficiency in SQL (including window functions and CTE)
  • Proficiency in RDBMS (e.g., MySQL, PostgreSQL)
  • Strong programming skills in Python (with libraries like Polars, optionally Arrow / PyArrow API)
  • First exposure to OLAP query engines (e.g., Clickhouse, DuckDB, Apache Spark)
  • Familiarity with Apache Airflow (or similar tools like Dagster or Prefect)
  • Strong teamwork and communication skills
  • Ability to work independently and manage your time effectively
Job Responsibility:
  • Assist in designing, building, and maintaining efficient data pipelines
  • Work on data modeling tasks to support the creation and maintenance of data warehouses
  • Integrate data from multiple sources, ensuring data consistency and reliability
  • Collaborate in implementing and managing data orchestration processes and tools
  • Help establish monitoring systems to maintain high standards of data quality and availability
  • Work closely with the Data Architect, Senior Data Engineers, and other members across the organization on various data infrastructure projects
  • Participate in the optimization of data processes, seeking opportunities to enhance system performance
What we offer:
  • Competitive salary and benefits package

Data Engineer

We are seeking a skilled and innovative Data Engineer to join our team in Nieuwe...
Location:
Netherlands, Nieuwegein
Salary:
3000.00 - 6000.00 EUR / Month
Sopra Steria
Expiration Date:
Until further notice
Requirements:
  • BSc or MSc degree in IT or a related field
  • Minimum of 2 years of relevant work experience in data engineering
  • Proficiency in building data pipelines using tools such as Azure Data Factory, Informatica Cloud, Synapse Pro, Spark, Python, R, Kubernetes, Snowflake, Databricks, or AWS
  • Advanced SQL knowledge and experience with relational databases
  • Hands-on experience in data modelling and data integration (both on-premise and cloud-based)
  • Strong problem-solving skills and analytical mindset
  • Knowledge of data warehousing concepts and big data technologies
  • Experience with version control systems, preferably Git
  • Excellent communication skills and ability to work collaboratively in a team environment
  • Fluency in Dutch language (required)
Job Responsibility:
  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes
  • Collaborate with Information Analysts to provide technical frameworks for business requirements of medium complexity
  • Contribute to architecture discussions and identify potential technical and process bottlenecks
  • Implement data quality checks and ensure data integrity throughout the data lifecycle
  • Optimise data storage and retrieval systems for improved performance
  • Work closely with cross-functional teams to understand data needs and deliver efficient solutions
  • Stay up-to-date with emerging technologies and best practices in data engineering
  • Troubleshoot and resolve data-related issues in a timely manner
  • Document data processes, architectures, and workflows for knowledge sharing and future reference
What we offer:
  • A permanent contract and a gross monthly salary between €3,000 and €6,000 (based on 40 hours per week)
  • 8% holiday allowance
  • A generous mobility budget, including options such as an electric lease car with an NS Business Card, a lease bike, or alternative transportation that best suits your travel needs
  • 8% profit sharing on target (or a fixed OTB amount, depending on the role)
  • 27 paid vacation days
  • A flex benefits budget of €1,800 per year, plus an additional percentage of your salary. This can be used for things like purchasing extra vacation days or contributing more to your pension
  • A home office setup with a laptop, phone, and a monthly internet allowance
  • Hybrid working: from home or at the office, depending on what works best for you
  • Development opportunities through training, knowledge-sharing sessions, and inspiring (networking) events
  • Social activities with colleagues — from casual drinks to sports and content-driven outings
  • Full-time

Sap Btp Data Engineer

At LeverX, we have had the privilege of working on over 950 SAP projects, includ...
Location:
Salary:
Not provided
LeverX
Expiration Date:
Until further notice
Requirements:
  • 2+ years of experience designing and developing SAP data solutions within SAP and non-SAP enterprise landscapes
  • Strong knowledge of data modeling in Data Warehouses
  • Strong knowledge of the visualization patterns, approaches, and techniques in SAP and non-SAP landscapes
  • Understanding of Data Engineering solutions within the SAP BDC landscape, such as SAP Databricks
  • Proven experience in data transformation and integration with SAP ERP, S/4HANA, and equivalent external systems
  • Good understanding of SAP data integration techniques (SDI, SDA, APIs, ODP) and protocols (OData, REST, JDBC)
  • Bachelor’s degree in Computer Science, Information Systems, or equivalent
  • English B2+
Job Responsibility:
  • Design, develop, and deploy enterprise data solutions on SAP BTP, integrating SAP and non-SAP systems
  • Analyze and resolve complex data and integration challenges, ensuring reliable and scalable solutions
  • Collaborate with data architects, functional analysts, and business stakeholders to translate requirements into data models, dashboards, and analytics
  • Lead small project teams (2–3 members) and contribute to cross-regional collaboration for consistent delivery
  • Facilitate client enablement through workshops, webinars, and hands-on sessions
  • Continuously grow expertise by staying current with SAP data and analytics technologies (e.g., SAP Business Data Cloud, AI/ML) and pursuing relevant certifications
What we offer:
  • 89% of projects use the newest SAP technologies and frameworks
  • Expert communities and internal courses
  • Valuable perks to support your growth and well-being
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay in the company for 4+ years

Software Engineer 2 / Senior Software Engineer

We are looking for an experienced Software Engineers for our Bangalore location ...
Location:
India, Bengaluru
Salary:
Not provided
Komprise, Inc.
Expiration Date:
Until further notice
Requirements:
  • Solid grasp of computer science fundamentals and especially data structures, algorithms, multi-threading
  • Ability to solve difficult problems with a simple elegant solution
  • Should have solid object-oriented programming background with impeccable design skills
  • Experience in developing management applications and performance management applications is ideal
  • Experience with object-based file systems and REST interfaces is a plus (e.g. Amazon S3, Azure, Google Cloud Service)
  • Should have a BE or higher in CS, EE, Math or related engineering or science field
  • 5+ years of experience in software deployment
  • Tech Stack: Java, Maven, virtualisation, SaaS, GitHub, Jira, Slack, cloud solutions, and hypervisors
Job Responsibility:
  • Responsible for designing and developing features that power the Komprise data management platform to manage billions of files and petabytes of data
  • Responsible for designing major components and systems of the product architecture, ensuring that the Komprise data management platform is highly available and scalable
  • Responsible for writing performant code, evaluating feasibility, developing for quality, and optimizing for maintainability
  • Work in an agile, customer-focused, and fast-paced team with direct interaction with customers
  • Responsible for analysing customer-escalated issues and providing resolutions in a timely manner
  • Should be able to design and implement highly performant, scalable distributed systems

Data Engineer

We're seeking an experienced Data Engineer to join our Data Platform team. This ...
Location:
Chile, Santiago
Salary:
39100000.00 - 46000000.00 CLP / Year
Checkr
Expiration Date:
Until further notice
Requirements:
  • 2+ years of industry-related experience in a backend or data engineering role and a Bachelor’s degree or equivalent experience
  • Programming expertise in Python or SQL, with proficiency in one and at least some experience in the other
  • Experience developing and maintaining production data services
  • Experience with data modeling, security, and governance
  • Familiarity with modern CI/CD practices and tools (e.g., GitLab and Kubernetes)
  • Experience and passion for mentoring other data engineers
Job Responsibility:
  • Build, maintain and optimize critical data pipelines that serve as the foundation for Checkr’s data platform and products
  • Build tools that help streamline the management and operation of our data ecosystem
  • Architect systems for scale and security to keep up with a huge influx of data as Checkr continues to grow
  • Architect systems that empower repeatable and scalable machine learning workflows
  • Identify innovative applications of data that can enable new products or insights and enable other teams at Checkr to maximize their own impact
What we offer:
  • A collaborative and fast-moving environment
  • Be part of an international company based in the United States
  • Learning and development reimbursement allowance
  • Competitive compensation and opportunity for professional and personal advancement
  • 100% medical, dental, and vision coverage for employees and dependents
  • Additional vacation benefits of 5 extra days and flexibility to take time off
  • In-office perks are provided, such as lunch four times a week, a commuter stipend, and an abundance of snacks and beverages
  • Full-time

Senior Software Engineer, Data Platform

We are looking for a foundational member of the Data Team to enable Skydio to ma...
Location:
United States, San Mateo
Salary:
180000.00 - 240000.00 USD / Year
Skydio
Expiration Date:
Until further notice
Requirements:
  • 5+ years of professional experience
  • 2+ years in software engineering
  • 2+ years in data engineering with a bias towards getting your hands dirty
  • Deep experience with Databricks building pipelines, managing datasets, and developing dashboards or analytical applications
  • Proven track record of operating scalable data platforms, defining company-wide patterns that ensure reliability, performance, and cost effectiveness
  • Proficiency in SQL and at least one modern programming language (we use Python)
  • Comfort working across the full data stack — from ingestion and transformation to orchestration and visualization
  • Strong communication skills, with the ability to collaborate effectively across all levels and functions
  • Demonstrated ability to lead technical direction, mentor teammates, and promote engineering excellence and best practices across the organization
  • Familiarity with AI-assisted data workflows, including tools that accelerate data transformations or enable natural-language interfaces for analytics
Job Responsibility:
  • Design and scale the data infrastructure that ingests live telemetry from tens of thousands of autonomous drones
  • Build and evolve our Databricks and Palantir Foundry environments to empower every Skydian to query data, define jobs, and build dashboards
  • Develop data systems that make our products truly data-driven — from predictive analytics that anticipate hardware failures, to 3D connectivity mapping, to in-depth flight telemetry analysis
  • Create and integrate AI-powered tools for data analysis, transformation, and pipeline generation
  • Champion a data-driven culture by defining and enforcing best practices for data quality, lineage, and governance
  • Collaborate with autonomy, manufacturing, and operations teams to unify how data flows across the company
  • Lead and mentor data engineers, analysts, and stakeholders across Skydio
  • Ensure platform reliability by implementing robust monitoring, observability, and contributing to the on-call rotation for critical data systems
What we offer:
  • Equity in the form of stock options
  • Comprehensive benefits packages
  • Relocation assistance may also be provided for eligible roles
  • Paid vacation time
  • Sick leave
  • Holiday pay
  • 401K savings plan
  • Full-time

Data Engineer

We build simple yet innovative consumer products and developer APIs that shape h...
Location:
United States, San Francisco
Salary:
163200.00 - 223200.00 USD / Year
Plaid
Expiration Date:
Until further notice
Requirements:
  • 2+ years of dedicated data engineering experience, solving complex data pipeline issues at scale
  • Experience building data models and data pipelines on top of large datasets (in the order of 500TB to petabytes)
  • Value SQL as a flexible and extensible tool and are comfortable with modern SQL data orchestration tools like DBT, Mode, and Airflow
Job Responsibility:
  • Understanding different aspects of the Plaid product and strategy to inform golden dataset choices, design, and data usage principles
  • Keeping data quality and performance top of mind while designing datasets
  • Advocating for adopting industry tools and practices at the right time
  • Owning core SQL and Python data pipelines that power our data lake and data warehouse
  • Documenting data thoroughly, with defined dataset quality, uptime, and usefulness
What we offer:
  • Medical, dental, vision, and 401(k)
  • Full-time