
Data Engineer (AWS)


Fyld

Location:
Portugal, Lisboa

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Fyld is a Portuguese consulting company specializing in IT services. We bring high-performance professionals into the field across a wide range of technological areas. Inspired by sports management philosophy, we strive to achieve peak performance with each of our consultants. We focus on training and excellence. Join us for the next game!

Requirements:

  • Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or related
  • Relevant certifications in AWS, such as AWS Certified Solutions Architect, AWS Certified Developer, or AWS Certified Data Analytics
  • Hands-on experience with AWS services, especially those related to Big Data and data analytics, such as Amazon Redshift, Amazon EMR, Amazon Athena, Amazon Kinesis, AWS Glue, among others
  • Familiarity with data storage and processing services on AWS, including Amazon S3, Amazon RDS, Amazon DynamoDB, and AWS Lambda
  • Proficiency in programming languages such as Python, Scala, or Java for developing data pipelines and automation scripts
  • Knowledge of distributed data processing frameworks, such as Apache Spark or Apache Flink
  • Experience in data modeling, cleansing, transformation, and preparation for analysis
  • Ability to work with different types of data, including structured, unstructured, and semi-structured data
  • Familiarity with data architecture concepts such as data lakes, data warehouses, and data pipelines (not mandatory)
  • Knowledge of security and compliance practices on AWS, including access control, data encryption, and regulatory compliance
  • Excellent verbal and written communication skills to interact with technical and non-technical teams
  • Ability to solve complex data-related problems and adapt to new challenges and technologies on AWS
  • Fluent in English

Nice to have:

  • High standards, professionalism, and dedication (everything else can be learned)

Additional Information:

Job Posted:
March 28, 2025

Employment Type:
Fulltime
Work Type:
On-site work


Similar Jobs for Data Engineer (AWS)


AWS Data Engineer

We are seeking a skilled AWS Data Engineer to join our team and help drive data ...
Location
United States, Charlotte
Salary:
60.00 USD / Hour
Realign
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Computer Science, Information Systems, or related field
  • 3+ years of experience in data engineering, with a focus on AWS cloud services
  • Proficiency in SQL, Python, and AWS data services such as S3, Glue, EMR, and Redshift
  • Experience with ETL processes, data modeling, and data visualization tools
  • Strong analytical and problem-solving skills
  • Excellent communication and teamwork abilities
Job Responsibility
  • Design and implement scalable and efficient data pipelines using AWS services such as S3, Glue, EMR, and Redshift
  • Develop and maintain data lakes and data warehouses to store and process large volumes of structured and unstructured data
  • Collaborate with data scientists and business analysts to deliver actionable insights and analytics solutions
  • Optimize data infrastructure for performance, reliability, and cost efficiency
  • Troubleshoot and resolve data integration and data quality issues
  • Stay current with industry trends and best practices in cloud data engineering
  • Provide technical guidance and mentorship to junior team members

Senior Data Engineer – Data Engineering & AI Platforms

We are looking for a highly skilled Senior Data Engineer (L2) who can design, bu...
Location
India, Chennai, Madurai, Coimbatore
Salary:
Not provided
OptiSol Business Solutions
Expiration Date
Until further notice
Requirements
  • Strong hands-on expertise in cloud ecosystems (Azure / AWS / GCP)
  • Excellent Python programming skills with data engineering libraries and frameworks
  • Advanced SQL capabilities including window functions, CTEs, and performance tuning
  • Solid understanding of distributed processing using Spark/PySpark
  • Experience designing and implementing scalable ETL/ELT workflows
  • Good understanding of data modeling concepts (dimensional, star, snowflake)
  • Familiarity with GenAI/LLM-based integration for data workflows
  • Experience working with Git, CI/CD, and Agile delivery frameworks
  • Strong communication skills for interacting with clients, stakeholders, and internal teams
Job Responsibility
  • Design, build, and maintain scalable ETL/ELT pipelines across cloud and big data platforms
  • Contribute to architectural discussions by translating business needs into data solutions spanning ingestion, transformation, and consumption layers
  • Work closely with solutioning and pre-sales teams for technical evaluations and client-facing discussions
  • Lead squads of L0/L1 engineers—ensuring delivery quality, mentoring, and guiding career growth
  • Develop cloud-native data engineering solutions using Python, SQL, PySpark, and modern data frameworks
  • Ensure data reliability, performance, and maintainability across the pipeline lifecycle—from development to deployment
  • Support long-term ODC/T&M projects by demonstrating expertise during technical discussions and interviews
  • Integrate emerging GenAI tools where applicable to enhance data enrichment, automation, and transformations
What we offer
  • Opportunity to work at the intersection of Data Engineering, Cloud, and Generative AI
  • Hands-on exposure to modern data stacks and emerging AI technologies
  • Collaboration with experts across Data, AI/ML, and cloud practices
  • Access to structured learning, certifications, and leadership mentoring
  • Competitive compensation with fast-track career growth and visibility
  • Fulltime

Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location
United States, Chicago
Salary:
130000.00 USD / Year
AKUNA CAPITAL
Expiration Date
Until further notice
Requirements
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including the ability to quickly and accurately respond and/or solve math and coding problems are essential functions of the role
Job Responsibility
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) pertaining to data quality, data availability, and data correctness
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
What we offer
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits
  • Fulltime

Data Engineering Support Engineer / Manager

Wissen Technology is hiring a seasoned Data Engineering Support Engineer / Manag...
Location
India, Mumbai; Pune
Salary:
Not provided
Wissen
Expiration Date
Until further notice
Requirements
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 8-12 years of work experience
  • Python, SQL
  • Familiarity with data engineering
  • Experience with AWS data and analytics services or similar cloud vendor services
  • Strong problem solving and communication skills
  • Ability to organise and prioritise work effectively
Job Responsibility
  • Incident and user management for data and analytics platform
  • Development and maintenance of Data Quality framework (including anomaly detection)
  • Implementation of Python & SQL hotfixes and working with data engineers on more complex issues
  • Diagnostic tools implementation and automation of operational processes
  • Work closely with data scientists, data engineers, and platform engineers in a highly commercial environment
  • Support research analysts and traders with issue resolution
  • Fulltime

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location
Vietnam, Hanoi
Salary:
Not provided
Token Metrics
Expiration Date
Until further notice
Requirements
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of Python, Java or any programming language development experience
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expert with building Data Lake, Data Warehouse or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work
Job Responsibility
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building a data pipeline from different data sources using different data types like API, CSV, JSON, etc
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs
  • Fulltime

Senior Data Engineer

Kiddom is redefining how technology powers learning. We combine world-class curr...
Location
United States, San Francisco
Salary:
150000.00 - 220000.00 USD / Year
Kiddom
Expiration Date
Until further notice
Requirements
  • 3+ years of experience as a data engineer
  • 8+ years of software engineering experience (including data engineering)
  • Proven experience as a Data Engineer or in a similar role with strong data modeling, architecture, and design skills
  • Strong understanding of data engineering principles including infrastructure deployment, governance and security
  • Experience with MySQL, Snowflake, and Cassandra, and familiarity with graph databases (Neptune or Neo4j)
  • Proficiency in SQL and Python (Golang a plus)
  • Proficient with AWS offerings such as AWS Glue, EKS, ECS and Lambda
  • Excellent communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders
  • Strong understanding of PII compliance and best practices in data handling and storage
  • Strong problem-solving skills, with a knack for optimizing performance and ensuring data integrity and accuracy
Job Responsibility
  • Design, implement, and maintain the organization’s data infrastructure, ensuring it meets business requirements and technical standards
  • Deploy data pipelines to AWS infrastructure such as EKS, ECS, Lambdas and AWS Glue
  • Develop and deploy data pipelines to clean and transform data to support other engineering teams, analytics and AI applications
  • Extract and deploy reusable features to Feature stores such as Feast or equivalent
  • Evaluate and select appropriate database technologies, tools, and platforms, both on-premises and in the cloud
  • Monitor data systems and troubleshoot issues related to data quality, performance, and integrity
  • Work closely with other departments, including Product, Engineering, and Analytics, to understand and cater to their data needs
  • Define and document data workflows, pipelines, and transformation processes for clear understanding and knowledge sharing
What we offer
  • Meaningful equity
  • Health insurance benefits: medical (various PPO/HMO/HSA plans), dental, vision, disability and life insurance
  • One Medical membership (in participating locations)
  • Flexible vacation time policy (subject to internal approval); average use is 4 weeks off per year
  • 10 paid sick days per year (prorated depending on start date)
  • Paid holidays
  • Paid bereavement leave
  • Paid family leave after birth/adoption: a minimum of 16 paid weeks for birthing parents and 10 weeks for caretaker parents, meant to supplement benefits offered by the state
  • Commuter and FSA plans
  • Fulltime

Senior Data Engineer

For this role, we are seeking a Senior Data Engineer for our Client's ETL Suppor...
Location
India
Salary:
Not provided
3Pillar Global
Expiration Date
Until further notice
Requirements
  • In-depth knowledge of AWS Glue, AWS Lambda, and AWS Step Functions
  • A deep understanding of ETL processes and data warehouse design
  • Proven ability to troubleshoot data pipelines and perform root cause analysis (RCA)
  • 3-5 years of relevant experience
  • Hands-on experience with Glue, Lambda, and Step Function development
  • Must be able to work a day shift that includes coverage for weekends and holidays on a rotational basis
Job Responsibility
  • Monitor approximately 2,300 scheduled jobs (daily, weekly, monthly) to ensure timely and successful execution
  • Execute on-demand jobs as required by the business
  • Troubleshoot job failures, perform detailed root cause analysis (RCA), and provide clear documentation for all findings
  • Address and resolve bugs and data-related issues reported by the business team
  • Verify source file placement in designated directories to maintain data integrity
  • Reload Change Data Capture (CDC) tables when structural changes occur in source systems
  • Help manage synchronization between external databases (including Teradata write-backs) and AWS Glue tables
  • Assist in developing new solutions, enhancements, and bug fixes using AWS Glue, Lambda, and Step Functions
  • Answer questions from the business and support User Acceptance Testing (UAT) inquiries
  • Make timely decisions to resolve issues, execute tasks efficiently, and escalate complex problems to senior or lead engineers as needed, all while maintaining agreed-upon SLAs
  • Fulltime

Senior Data Solutions Architect with AWS

Provectus, a leading AI consultancy and solutions provider specializing in Data ...
Location
Poland, Wroclaw
Salary:
Not provided
Provectus
Expiration Date
Until further notice
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
  • Minimum of 8 years of experience in data solution architecture, with at least 3 years focused on AWS
  • Proven experience in designing and implementing large-scale data engineering solutions on AWS
  • Experience with Databricks is a plus
  • Deep expertise in AWS platform services, including S3, EC2, Lambda, EMR, Glue, Redshift, AWS MSK, and EKS
  • Proficient in programming languages like Python, SQL, and Scala
  • Experience with data warehousing, ETL processes, and real-time data streaming
  • Familiarity with open-source technologies and tools commonly used in data engineering
  • AWS Certified Solutions Architect – Professional or similar AWS certifications are a plus
  • Excellent communication and presentation skills, with the ability to articulate complex technical concepts to non-technical stakeholders
Job Responsibility
  • Lead complex, high-impact customer engagements focused on AWS Data Platform solutions
  • Define and drive technical strategies that align AWS capabilities with customer business objectives, incorporating Databricks (as a plus) solutions where appropriate
  • Architect and design scalable data platforms using AWS, ensuring optimal performance, reliability, security, and cost efficiency
  • Evaluate and select appropriate technologies and tools to meet customer needs, integrating AWS services with other solutions such as Databricks, and Snowflake as necessary
  • Establish and maintain comprehensive architectural documentation to ensure alignment with technical standards and best practices across the organization
  • Collaborate with the sales team during the pre-sales process by providing technical expertise to position AWS-based data solutions effectively
  • Participate in customer meetings to assess technical needs, scope potential solutions, and identify opportunities for growth
  • Create technical proposals, solution architectures, and presentations to support sales efforts and ensure alignment with customer expectations
  • Assist in responding to RFPs/RFIs by providing accurate technical input and aligning solutions to client requirements
  • Demonstrate AWS capabilities through POCs (Proof of Concepts) and technical demonstrations to help customers evaluate the proposed solutions