
AWS Data Engineer

https://www.randstad.com

Randstad


Location:
India, Chennai

Category:
IT - Software Development


Contract Type:
Not provided


Salary:

Not provided

Job Description:

Collaborate with Product teams to ensure that raw data is cleansed, transformed, and usable by downstream consumers (ML Engineers, BI analytics). Assist and advise on the re-development and modernisation of end-to-end ETL pipelines, and introduce new technologies where appropriate, in a real-time streaming environment dealing with very large data volumes. Work with AWS cloud infrastructure (specifically SQS, SNS, Redshift, OpenSearch, Athena, Kinesis, and AWS CodePipeline) to develop, innovate, and maintain data flowing through various queues and data warehouses. Work with a variety of data repository platforms (including SQL stores such as Oracle), as well as implementing data visualisation and network analysis (e.g. GraphDB). Maintain and ‘productionise’ machine learning and AI models. Assist in the creation of next-generation data ingestion platforms, sourcing data via web scrapes, APIs, email, and flat-file (FTP) methods. Understand conflict resolution methods, and assist subject matter experts in the debugging of data ingestion and in managing overall feed uptimes across a large set of data collectors. Create and maintain detailed documentation and functional design specifications, including data flows and data conversion. Provide technical information to assist in the development of client-facing product documentation. Adhere to change management protocols and version control. Present advanced technical designs to non-technical stakeholders.

Job Responsibility:

  • Collaborate with Product teams to ensure that raw data is cleansed, transformed, and usable by downstream consumers (ML Engineers, BI analytics)
  • Assist and advise on the re-development and modernisation of end-to-end ETL pipelines and introduce new technologies where appropriate in a real-time streaming environment dealing with very large data volumes

Requirements:

  • Expert-level PL/SQL
  • Working with AWS cloud infrastructure (specifically SQS, SNS, Redshift, OpenSearch, Athena, Kinesis, and AWS CodePipeline)
  • Working with a variety of data repository platforms (including SQL stores such as Oracle)
  • Implementing data visualisation and network analysis (e.g. GraphDB)
  • Maintain and ‘productionise’ machine learning and AI models
  • Assist in the creation of next-generation data ingestion platforms, sourcing data via web scrapes, APIs, email, and flat-file (FTP) methods
  • Understanding of conflict resolution methods
  • Assist subject matter experts in the debugging of data ingestion and in managing overall feed uptimes across a large set of data collectors
  • Create and maintain detailed documentation and functional design specifications including data flows and data conversion
  • Provide technical information to assist in the development of client-facing product documentation
  • Adhere to change management protocols and version control
  • Present advanced technical designs to non-technical stakeholders

Additional Information:

Job Posted:
December 23, 2025

Expiration:
January 01, 2026

Employment Type:
Fulltime



Similar Jobs for Aws Data Engineer


AWS Data Engineer

AlgebraIT is hiring an AWS Data Engineer in Austin, Texas! If you have at least ...
Location:
United States, Austin
Salary:
Not provided
AlgebraIT
Expiration Date
Until further notice
Requirements:
  • 3+ years of experience in data engineering with AWS
  • Proficiency in Python, SQL, and big data tools
  • Experience with AWS services such as Lambda and EC2
  • Strong communication and teamwork skills
  • Bachelor’s in Computer Science or similar
Job Responsibility:
  • Develop and maintain data pipelines using AWS services
  • Automate data ingestion and processing workflows
  • Collaborate with cross-functional teams to ensure robust data solutions
  • Monitor and optimize data pipeline performance
  • Ensure data quality and implement security best practices
  • Integrate data from multiple sources for analytics
  • Implement data validation and error-handling processes
  • Write and maintain technical documentation for data workflows
  • Manage and configure cloud infrastructure related to data pipelines
  • Provide technical support and troubleshooting for data-related issues

AWS Data Engineer

We are seeking a skilled AWS Data Engineer to join our team and help drive data ...
Location:
United States, Charlotte
Salary:
60.00 USD / Hour
Realign
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Information Systems, or related field
  • 3+ years of experience in data engineering, with a focus on AWS cloud services
  • Proficiency in SQL, Python, and AWS data services such as S3, Glue, EMR, and Redshift
  • Experience with ETL processes, data modeling, and data visualization tools
  • Strong analytical and problem-solving skills
  • Excellent communication and teamwork abilities
Job Responsibility:
  • Design and implement scalable and efficient data pipelines using AWS services such as S3, Glue, EMR, and Redshift
  • Develop and maintain data lakes and data warehouses to store and process large volumes of structured and unstructured data
  • Collaborate with data scientists and business analysts to deliver actionable insights and analytics solutions
  • Optimize data infrastructure for performance, reliability, and cost efficiency
  • Troubleshoot and resolve data integration and data quality issues
  • Stay current with industry trends and best practices in cloud data engineering
  • Provide technical guidance and mentorship to junior team members

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location:
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements:
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility:
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer:
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility

Data Engineer (AWS)

Fyld is a Portuguese consulting company specializing in IT services. We bring hi...
Location:
Portugal, Lisboa
Salary:
Not provided
Fyld
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or related
  • Relevant certifications in AWS, such as AWS Certified Solutions Architect, AWS Certified Developer, or AWS Certified Data Analytics
  • Hands-on experience with AWS services, especially those related to Big Data and data analytics, such as Amazon Redshift, Amazon EMR, Amazon Athena, Amazon Kinesis, Amazon Glue, among others
  • Familiarity with data storage and processing services on AWS, including Amazon S3, Amazon RDS, Amazon DynamoDB, and AWS Lambda
  • Proficiency in programming languages such as Python, Scala, or Java for developing data pipelines and automation scripts
  • Knowledge of distributed data processing frameworks, such as Apache Spark or Apache Flink
  • Experience in data modeling, cleansing, transformation, and preparation for analysis
  • Ability to work with different types of data, including structured, unstructured, and semi-structured data
  • Familiarity with data architecture concepts such as data lakes, data warehouses, and data pipelines (not mandatory)
  • Knowledge of security and compliance practices on AWS, including access control, data encryption, and regulatory compliance

Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location:
United States, Chicago
Salary:
130000.00 USD / Year
AKUNA CAPITAL
Expiration Date
Until further notice
Requirements:
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including the ability to quickly and accurately respond and/or solve math and coding problems are essential functions of the role
Job Responsibility:
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) pertaining to data quality, data availability, and data correctness
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
What we offer:
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits

Data Engineering Support Engineer / Manager

Wissen Technology is hiring a seasoned Data Engineering Support Engineer / Manag...
Location:
India, Mumbai; Pune
Salary:
Not provided
Wissen
Expiration Date
Until further notice
Requirements:
  • Bachelor of Technology or Master's degree in Computer Science, Engineering, or related field
  • 8-12 years of work experience
  • Python, SQL
  • Familiarity with data engineering
  • Experience with AWS data and analytics services or similar cloud vendor services
  • Strong problem solving and communication skills
  • Ability to organise and prioritise work effectively
Job Responsibility:
  • Incident and user management for data and analytics platform
  • Development and maintenance of Data Quality framework (including anomaly detection)
  • Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
  • Diagnostic tools implementation and automation of operational processes
  • Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
  • Support research analysts and traders with issue resolution

Senior Data Platform Engineer

We are looking for an experienced data engineer to join our platform engineering...
Location:
United States
Salary:
141000.00 - 225600.00 USD / Year
Axon
Expiration Date
Until further notice
Requirements:
  • 5+ years of experience in data engineering, software engineering with a data focus, data science, or a related role
  • Knowledge of designing data pipelines from a variety of sources (e.g. streaming, flat files, APIs)
  • Proficiency in SQL and experience with relational databases (e.g., PostgreSQL)
  • Experience with real-time data processing frameworks (e.g., Apache Kafka, Spark Streaming, Flink, Pulsar, Redpanda)
  • Strong programming skills in common data-focused languages (e.g., Python, Scala)
  • Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Prefect, Temporal)
  • Familiarity with AWS-based data solutions
  • Strong understanding of data warehousing concepts and technologies (Snowflake)
  • Experience documenting data dependency maps and data lineage
  • Strong communication and collaboration skills
Job Responsibility:
  • Design, implement, and maintain scalable data pipelines and infrastructure
  • Collaborate with software engineers, product managers, customer success managers, and others across the business to understand data requirements
  • Optimize and manage our data storage solutions
  • Ensure data quality, reliability, and security across the data lifecycle
  • Develop and maintain ETL processes and frameworks
  • Work with stakeholders to define data availability SLAs
  • Create and manage data models to support business intelligence and analytics
What we offer:
  • Competitive salary and 401k with employer match
  • Discretionary time off
  • Paid parental leave for all
  • Medical, Dental, Vision plans
  • Fitness Programs
  • Emotional & Development Programs
  • Snacks in our offices

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location:
Vietnam, Hanoi
Salary:
Not provided
Token Metrics
Expiration Date
Until further notice
Requirements:
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of Python, Java or any programming language development experience
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expert with building Data Lake, Data Warehouse or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work
Job Responsibility:
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building a data pipeline from different data sources using different data types like API, CSV, JSON, etc
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs
Welcome to CrawlJobs.com
Your Global Job Discovery Platform
At CrawlJobs.com, we simplify finding your next career opportunity by bringing job listings directly to you from all corners of the web. Using cutting-edge AI and web-crawling technologies, we gather and curate job offers from various sources across the globe, ensuring you have access to the most up-to-date job listings in one place.