Senior Data Engineer – Data Engineering & AI Platforms

OptiSol Business Solutions

Location:
India (Chennai, Madurai, Coimbatore)

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for a highly skilled Senior Data Engineer (L2) who can design, build, and optimize modern data platforms while contributing to solutioning and ODC (offshore development center) delivery engagements. This opportunity is ideal for individuals who thrive in fast-paced environments, can balance hands-on engineering with leadership responsibilities, and want to grow in the Data & AI engineering space.

Job Responsibility:

  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers, ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks (see the sketch after this list)
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle, from development to deployment
  • Support long-term ODC/T&M (time and materials) projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
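
To make the pipeline work concrete, here is a minimal sketch of a cloud-native ETL step in PySpark. The bucket paths, dataset, and column names are illustrative assumptions, not details from the role:

```python
# Minimal ETL sketch in PySpark; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw landing-zone data (path is an assumption)
raw = spark.read.json("s3://landing-zone/orders/")

# Transform: deduplicate, type-cast, and filter bad records
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned, query-friendly curated table
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://curated-zone/orders/"
)
```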

Requirements:

  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning (illustrated in the sketch after this list)
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
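
For the SQL requirement above, the sketch below shows the kind of CTE-plus-window-function pattern in question, issued through Spark SQL from Python. The orders table and its columns are hypothetical and assumed to be registered with the session:

```python
# Illustrative CTE + window function; assumes an "orders" table is
# registered in the Spark session. All names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

latest_orders = spark.sql("""
    WITH ranked AS (
        SELECT customer_id,
               order_id,
               amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY order_ts DESC
               ) AS rn
        FROM orders
    )
    SELECT customer_id, order_id, amount
    FROM ranked
    WHERE rn = 1  -- most recent order per customer
""")
latest_orders.show()
```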

Nice to have:

  • Demonstrated experience in leading technical teams or delivery squads
  • Prior involvement in architecture and solution design for data engineering programs
  • Certifications in cloud or data engineering (Azure DP-203, AWS Data Analytics, etc.)
  • Knowledge of data governance, quality, and compliance frameworks
  • Exposure to large-scale cloud data migrations or modernization initiatives

What we offer:

  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility

Additional Information:

Job Posted:
December 11, 2025

Employment Type:
Full-time

Similar Jobs for Senior Data Engineer – Data Engineering & AI Platforms

Senior Data Engineer

We are looking for a Senior Data Engineer (SDE 3) to build scalable, high-perfor...
Location: India, Mumbai
Salary: Not provided
Company: Cogoport (https://cogoport.com/)
Expiration Date: Until further notice

Requirements:
  • 6+ years of experience in data engineering, working with large-scale distributed systems
  • Strong proficiency in Python, Java, or Scala for data processing
  • Expertise in SQL and NoSQL databases (PostgreSQL, Cassandra, Snowflake, Apache Hive, Redshift)
  • Experience with big data processing frameworks (Apache Spark, Flink, Hadoop)
  • Hands-on experience with real-time data streaming (Kafka, Kinesis, Pulsar) for logistics use cases
  • Deep knowledge of AWS/GCP/Azure cloud data services like S3, Glue, EMR, Databricks, or equivalent
  • Familiarity with Airflow, Prefect, or Dagster for workflow orchestration
  • Strong understanding of logistics and supply chain data structures, including freight pricing models, carrier APIs, and shipment tracking systems
Job Responsibility:
  • Design and develop real-time and batch ETL/ELT pipelines for structured and unstructured logistics data (freight rates, shipping schedules, tracking events, etc.)
  • Optimize data ingestion, transformation, and storage for high availability and cost efficiency
  • Ensure seamless integration of data from global trade platforms, carrier APIs, and operational databases
  • Architect scalable, cloud-native data platforms using AWS (S3, Glue, EMR, Redshift), GCP (BigQuery, Dataflow), or Azure
  • Build and manage data lakes, warehouses, and real-time processing frameworks to support analytics, machine learning, and reporting needs
  • Optimize distributed databases (Snowflake, Redshift, BigQuery, Apache Hive) for logistics analytics
  • Develop streaming data solutions using Apache Kafka, Pulsar, or Kinesis to power real-time shipment tracking, anomaly detection, and dynamic pricing (see the sketch after this list)
  • Enable AI-driven freight rate predictions, demand forecasting, and shipment delay analytics
  • Improve customer experience by providing real-time visibility into supply chain disruptions and delivery timelines
  • Ensure high availability, fault tolerance, and data security compliance (GDPR, CCPA) across the platform
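
As a sketch of the streaming bullet above, here is a minimal PySpark Structured Streaming read from Kafka; the broker address, topic, and sink paths are assumptions:

```python
# Hypothetical streaming ingest of shipment-tracking events from Kafka.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("tracking_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
         .option("subscribe", "shipment-tracking")          # placeholder
         .load()
         # Kafka delivers key/value as binary; decode the payload
         .select(F.col("value").cast("string").alias("event_json"))
)

query = (
    events.writeStream.format("parquet")
          .option("path", "s3://stream-sink/tracking/")
          .option("checkpointLocation", "s3://stream-sink/_checkpoints/")
          .start()
)
query.awaitTermination()
```
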
What we offer:
  • Work with some of the brightest minds in the industry
  • Entrepreneurial culture fostering innovation, impact, and career growth
  • Opportunity to work on real-world logistics challenges
  • Collaborate with cross-functional teams across data science, engineering, and product
  • Be part of a fast-growing company scaling next-gen logistics platforms using advanced data engineering and AI

Senior AI Data Engineer

We are looking for a Senior AI Data Engineer to join an exciting project for our...
Location: Poland, Warsaw
Salary: Not provided
Company: Inetum (https://www.inetum.com)
Expiration Date: Until further notice

Requirements:
  • Degree in Computer Science, Data Science, Artificial Intelligence, or a related field
  • Several years of experience in AI and Machine Learning development, preferably in Customer Care solutions
  • Strong proficiency in Python and NLP frameworks
  • Hands-on experience with Azure AI services (e.g., Azure Machine Learning, Cognitive Services, Bot Services)
  • Solid understanding of cloud architectures and microservices on Azure
  • Experience with CI/CD pipelines and MLOps
  • Excellent leadership and communication skills
  • Analytical mindset with strong problem-solving abilities
  • Polish and English at a minimum B2 level
Job Responsibility:
  • Lead the development and implementation of AI-powered features for a Customer Care platform
  • Design and deploy Machine Learning and NLP models to automate customer inquiries (one illustrative call is sketched after this list)
  • Collaborate with DevOps and cloud architects to ensure a high-performance, scalable, and secure Azure-based architecture
  • Optimize AI models to enhance customer experience
  • Integrate Conversational AI, chatbots, and language models into the platform
  • Evaluate emerging technologies and best practices in Artificial Intelligence
  • Mentor and guide a team of AI/ML developers
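
One plausible shape for the NLP work above, sketched with Azure's text analytics client; the endpoint, key, sample message, and routing rule are placeholders rather than details of the actual platform:

```python
# Hedged sketch: triaging a customer inquiry with Azure AI Language
# (Text Analytics). Endpoint, key, and routing logic are invented.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

inquiries = ["My order arrived damaged and nobody has replied for a week."]

for doc in client.analyze_sentiment(inquiries):
    if not doc.is_error:
        # e.g. route negative messages to a human agent, the rest to a bot
        print(doc.sentiment, doc.confidence_scores)
```
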
What we offer:
  • Flexible working hours
  • Hybrid work model, allowing employees to divide their time between home and modern offices in key Polish cities
  • A cafeteria system that allows employees to personalize benefits by choosing from a variety of options
  • Generous referral bonuses, offering up to PLN6,000 for referring specialists
  • Additional revenue sharing opportunities for initiating partnerships with new clients
  • Ongoing guidance from a dedicated Team Manager for each employee
  • Tailored technical mentoring from an assigned technical leader, depending on individual expertise and project needs
  • Dedicated team-building budget for online and on-site team events
  • Opportunities to participate in charitable initiatives and local sports programs
  • A supportive and inclusive work culture with an emphasis on diversity and mutual respect

Senior Data Engineer

Senior Data Engineer role at UpGuard supporting analytics teams to extract insig...
Location: Australia (Sydney, Melbourne, Brisbane, Hobart)
Salary: Not provided
Company: UpGuard (https://www.upguard.com)
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience with data sourcing, storage, and modelling to effectively deliver business value right through to the BI platform
  • AI-first mindset and experience scaling an Analytics and BI function at another SaaS business
  • Experience with Looker (Explores, Looks, Dashboards, Developer interface, dimensions and measures, models, raw SQL queries)
  • Experience with CloudSQL (PostgreSQL) and BigQuery (complex queries, indices, materialised views, clustering, partitioning)
  • Experience with Containers, Docker and Kubernetes (GKE)
  • Familiarity with n8n for automation
  • Experience with programming languages (Go for ETL workers)
  • Comfortable interfacing with various APIs (REST+JSON or MCP Server)
  • Experience with version control via GitHub and GitHub Flow
  • Security-first mindset
Job Responsibility:
  • Design, build, and maintain reliable data pipelines to consolidate information from various internal systems and third-party sources (see the sketch after this list)
  • Develop and manage a comprehensive semantic layer using technologies like LookML, dbt or SQLMesh
  • Implement and enforce data quality checks, validation rules, and governance processes
  • Ensure AI agents have access to necessary structured and unstructured data
  • Create clear, self-maintaining documentation for data models, pipelines, and semantic layer
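
For the consolidation bullet above, a minimal sketch of landing a third-party export in BigQuery with the google-cloud-bigquery client; the bucket, dataset, and table names are assumptions:

```python
# Hypothetical load of a raw JSONL export from GCS into a staging table.
from google.cloud import bigquery

client = bigquery.Client()  # uses ambient GCP credentials

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    autodetect=True,
)

load_job = client.load_table_from_uri(
    "gs://raw-exports/crm/2025-12-11.jsonl",  # placeholder URI
    "analytics_staging.crm_accounts",         # placeholder table
    job_config=job_config,
)
load_job.result()  # block until the load completes
```
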
What we offer:
  • Great Place to Work certified company
  • Equal Employment Opportunity and Affirmative Action employer

Senior Data Engineer

Senior Data Engineer to design, develop, and optimize data platforms, pipelines,...
Location: United States, Chicago
Salary: 160,555.00 - 176,610.00 USD / year
Company: Adtalem Global Education (adtalem.com)
Expiration Date: Until further notice

Requirements:
  • Master's degree in Engineering Management, Software Engineering, Computer Science, or a related technical field
  • 3 years of experience in data engineering
  • Experience building data platforms and pipelines
  • Experience with AWS, GCP or Azure
  • Experience with SQL and Python for data manipulation, transformation, and automation
  • Experience with Apache Airflow for workflow orchestration
  • Experience with data governance, data quality, data lineage and metadata management
  • Experience with real-time data ingestion tools including Pub/Sub, Kafka, or Spark
  • Experience with CI/CD pipelines for continuous deployment and delivery of data products
  • Experience maintaining technical records and system designs
Job Responsibility:
  • Design, develop, and optimize data platforms, pipelines, and governance frameworks
  • Enhance business intelligence, analytics, and AI capabilities
  • Ensure accurate data flows and push data-driven decision-making across teams
  • Write product-grade performant code for data extraction, transformations, and loading (ETL) using SQL/Python
  • Manage workflows and scheduling using Apache Airflow and build custom operators for data ETL (a minimal DAG is sketched after this list)
  • Build, deploy and maintain both inbound and outbound data pipelines to integrate diverse data sources
  • Develop and manage CI/CD pipelines to support continuous deployment of data products
  • Utilize Google Cloud Platform (GCP) tools, including BigQuery, Composer, GCS, DataStream, and Dataflow, for building scalable data systems
  • Implement real-time data ingestion solutions using GCP Pub/Sub, Kafka, or Spark
  • Develop and expose REST APIs for sharing data across teams
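
To illustrate the Airflow bullet above, a minimal two-task DAG; the dag_id, schedule, and task bodies are invented for the sketch, which assumes Airflow 2.4+:

```python
# Hypothetical daily ETL DAG; task bodies are stubs.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull rows from the source system")

def load():
    print("write transformed rows to the warehouse")

with DAG(
    dag_id="daily_enrollment_etl",   # placeholder name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # "schedule" keyword is Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load
```
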
What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Annual incentive program

Senior Data Engineer

As a Senior Software Engineer, you will play a key role in designing and buildin...
Location: United States
Salary: 156,000.00 - 195,000.00 USD / year
Company: Apollo.io (apollo.io)
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience in platform engineering, data engineering, or a data-facing role
  • Experience in building data applications
  • Deep knowledge of the data ecosystem with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical / Computer Science, Engineering or Mathematics / Statistics)
  • Excellent communication skills
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open
Job Responsibility:
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows (see the sketch after this list)
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
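
A minimal sketch of the FastAPI-style data service named above; the endpoint, metric store, and values are invented:

```python
# Hypothetical data-service endpoint exposing a metric by name.
from fastapi import FastAPI, HTTPException

app = FastAPI()

METRICS = {"weekly_active_users": 123_456}  # stand-in for a real store

@app.get("/metrics/{name}")
def read_metric(name: str) -> dict:
    if name not in METRICS:
        raise HTTPException(status_code=404, detail="unknown metric")
    return {"name": name, "value": METRICS[name]}

# Run locally with: uvicorn main:app --reload
```
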
What we offer:
  • Equity
  • Company bonus or sales commissions/bonuses
  • 401(k) plan
  • At least 10 paid holidays per year
  • Flex PTO
  • Parental leave
  • Employee assistance program and wellbeing benefits
  • Global travel coverage
  • Life/AD&D/STD/LTD insurance
  • FSA/HSA and medical, dental, and vision benefits

Senior Manager, Data Engineering

You will build a team of talented engineers that will work cross functionally to...
Location: United States, San Jose
Salary: 240,840.00 - 307,600.00 USD / year
Company: Archer Aviation (archer.com)
Expiration Date: Until further notice

Requirements:
  • 6+ years of experience in a similar role, 2 of which are in a data leadership role
  • B.S. in a quantitative discipline such as Computer Science, Computer Engineering, Electrical Engineering, Mathematics, or a related field
  • Expertise with data engineering disciplines including data warehousing, database management, ETL processes, and ML model deployment
  • Experience with processing and storing telemetry data
  • Demonstrated experience with data governance standards and practices
  • 3+ years leading teams, including building and recruiting data engineering teams supporting diverse stakeholders
  • Experience with cloud-based data platforms such as AWS, GCP, or Azure
Job Responsibility:
  • Lead and continue to build a world-class team of engineers by providing technical guidance and mentorship
  • Design and implement scalable data infrastructure to ingest, process, store, and access multiple data sources supporting flight test, manufacturing and supply chain, and airline operations
  • Take ownership of data infrastructure to enable a highly scalable and cost-effective solution serving the needs of various business units
  • Build and support the development of novel tools to enable insight and decision making with teams across the organization
  • Evolve data engineering and AI strategy to align with the short and long term priorities of the organization
  • Help to establish a strong culture of data that is used throughout the company and industry
  • Lead initiatives to integrate AI capabilities in new and existing tools

Principal Consulting AI / Data Engineer

As a Principal Consulting AI / Data Engineer, you will design, build, and optimi...
Location: Australia, Sydney
Salary: Not provided
Company: DyFlex Solutions (dyflex.com.au)
Expiration Date: Until further notice

Requirements:
  • Proven expertise in delivering enterprise-grade data engineering and AI solutions in production environments
  • Strong proficiency in Python and SQL, plus experience with Spark, Airflow, dbt, Kafka, or Flink
  • Experience with cloud platforms (AWS, Azure, or GCP) and Databricks
  • Ability to confidently communicate and present at C-suite level, simplifying technical concepts into business impact
  • Track record of engaging senior executives and influencing strategic decisions
  • Strong consulting and stakeholder management skills with client-facing experience
  • Background in MLOps, ML pipelines, or AI solution delivery highly regarded
  • Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Job Responsibility:
  • Design, build, and maintain scalable data and AI solutions using Databricks, cloud platforms, and modern frameworks
  • Lead solution architecture discussions with clients, ensuring alignment of technical delivery with business strategy
  • Present to and influence executive-level stakeholders, including boards, C-suite, and senior directors
  • Translate highly technical solutions into clear business value propositions for non-technical audiences
  • Mentor and guide teams of engineers and consultants to deliver high-quality solutions
  • Champion best practices across data engineering, MLOps, and cloud delivery
  • Build DyFlex’s reputation as a trusted partner in Data & AI through thought leadership and client advocacy
What we offer:
  • Work with SAP’s latest cloud technologies such as S/4HANA, BTP and Joule, plus Databricks, ML/AI tools and cloud platforms
  • A flexible and supportive work environment including work from home
  • Competitive remuneration and benefits including novated lease, birthday leave, salary packaging, wellbeing programme, additional purchased leave, and company-provided laptop
  • Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
  • Structured career advancement pathways with opportunities to lead large-scale client programs
  • Exposure to diverse industries and client environments, including executive-level engagement

Senior AI Engineer

Elsewhen, a London-based consultancy, designs and builds technology solutions fo...
Location: United Kingdom, London
Salary: Not provided
Company: Elsewhen (elsewhen.com)
Expiration Date: Until further notice

Requirements:
  • Professional AI engineering experience
  • Background in Software Engineering with Python
  • Solid understanding of the Python standard library and modern Python coding, testing, debugging and automation techniques
  • Hands-on experience building solutions using LLMs and Agentic architectures with ADK, LlamaIndex, or LangGraph
  • Working with vector databases for embedding and indexing
  • Strong experience with cloud platforms
  • Strong experience with API design and frameworks like FastAPI or Flask
  • Solid experience with relational databases and SQL
  • Interest in expanding your knowledge into GenAI and machine learning
  • Excellent communication skills and the ability to work well in a collaborative team environment
Job Responsibility:
  • Experiment with POCs to find solutions for real-world problems using Large Language Models
  • Collaborate on AI-driven projects, working alongside engineers, product managers and AI specialists while maintaining clear documentation
  • Build and deploy Agentic LLM-based solutions with LangGraph
  • Work with different multi-agent system patterns
  • Build and deploy LLM-based solutions using RAG (see the retrieval sketch after this list)
  • Work with different types of databases: relational, graph, etc.
  • Design and optimise APIs using Python and FastAPI to serve AI solutions
  • Work within the GCP ecosystem, including Cloud Run
  • Build and optimise data pipelines for vector search and knowledge retrieval using vector databases and embedding models
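
As a toy sketch of the retrieval half of RAG referenced above: embed a query, rank stored chunks by cosine similarity, and hand the top hits to a prompt. Here embed() is a hypothetical stand-in for a real embedding model, faked so the sketch runs self-contained:

```python
# Toy retrieval for RAG; embed() fakes an embedding model with a
# seeded random unit vector. All chunks and queries are invented.
import numpy as np

def embed(text: str) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

chunks = ["refund policy ...", "shipping times ...", "warranty terms ..."]
index = np.stack([embed(c) for c in chunks])   # rows are unit vectors

query = "how long do refunds take?"
scores = index @ embed(query)                  # cosine similarities
top = [chunks[i] for i in np.argsort(scores)[::-1][:2]]

prompt = "Answer using only this context:\n" + "\n".join(top)
print(prompt)  # would be passed to the LLM
```
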
What we offer:
  • Private Health Insurance: Comprehensive coverage for both physical and mental health
  • Flexible and Remote-First Work Environment: Choose how and where you work, with the option for weekly team meet-ups in central London
  • Generous Leave Policy: 27 days of holiday plus bank holidays
  • Family-friendly policies, including enhanced maternity, paternity and shared parental leave
  • Learning and Development: Individual annual budget of £2,000 for learning and development, with dedicated learning days
  • Feel Better Fund: £500 to help set up your remote office
  • Social Events: Monthly and quarterly team events, an annual team trip, and half-yearly social events
  • Gym Membership Contribution: Support for maintaining your physical health
  • Pension Contribution: Enhanced employer pension contribution of 6%
  • Bonus Opportunities: Potential to receive a discretionary (non-contractual) bonus based on business and personal achievements