The purpose of the Data Engineer position is to leverage technical expertise and domain knowledge to design, build, and maintain efficient and robust data solutions. The Data Engineer shall be responsible for building, testing, and deploying data pipelines on the EDP. The engineer shall ensure that the pipelines are developed and deployed with a secure-by-design approach, delivering robust, thoroughly tested, and maintainable solutions.
Job Responsibilities:
Leverage technical expertise and domain knowledge to design, build, and maintain efficient and robust data solutions
Be responsible for building, testing, and deploying data pipelines on the EDP
Ensure that the pipelines are developed and deployed with a secure-by-design approach, delivering robust, thoroughly tested, and maintainable solutions
Develop pipelines using standard data pipeline and workflow patterns, utilizing StreamSets, Kestra, dbt, and Git
Design and implement data storage and processing solutions employing Snowflake
Utilize AWS services for cloud-based platform tooling and infrastructure, including but not limited to Lambda, ECS, MSK, RDS, EC2, Secrets Manager, ALB, CloudWatch, and EventBridge
Utilize Terraform for AWS and Azure deployments
Leverage and integrate APIs for data access and manipulation
Write Python scripts for common data processing and automation tasks
Leverage platform APIs and web applications to enforce platform security
Development experience with Go, SQL, C#, .NET, JavaScript, shell scripting, and container platforms such as Docker
Have experience integrating with time-series source systems such as Honeywell Plant Historian and OSI PI
Have experience with authentication mechanisms, including but not limited to OAuth 2.0, OIDC, Microsoft Entra, key-pair authentication, certificate-based authentication, and SAML-based SSO
Create and execute comprehensive test plans to ensure the pipelines' functionality and performance
Develop unit tests, integration tests, and end-to-end tests for data pipelines and workflows
Ensure data accuracy and consistency through rigorous testing processes
Leverage automated testing processes to enhance efficiency
Due to the crown-jewel nature of the enterprise data platform, Data Engineers may have access to PII, Confidential, and Most Confidential data
This role requires strict adherence to access processes and procedures to maintain data privacy and security
Identify and report any potential breaches of the Data Information and Systems Processes
Monitor and manage the platform to ensure optimal performance and uptime
Conduct regular maintenance tasks such as updates, patches, and backups
Resolve any issues or incidents related to the platform in a timely manner
Continuously improve platform operations through automation and optimization
Strong experience with Windows and Unix-like operating systems
Implement security best practices throughout the pipeline development and deployment process
Conduct regular security reviews and vulnerability assessments
Ensure data encryption, access control, and other security measures are enforced
Use credential management platforms such as Thycotic Secret Server and AWS Secrets Manager
Assist in troubleshooting and resolving intricate technical issues
Deliverables:
Robust and scalable data pipelines with well-documented code and processes
Comprehensive test plans and automation scripts ensuring platform reliability
Regular security assessments and compliance reports
Technical support and guidance documentation for delivery data engineers
Deliver secure, robust, and maintainable data pipelines
Ensure high-quality and thoroughly tested data solutions
Maintain compliance with security standards and best practices
Maintain compliance with the Data Lifecycle Management Process
Maintain compliance with the Data Privacy standards and best practices
Requirements:
Bachelor’s or Master’s degree in Computer Science, Data Science, Information Technology, or a related field with a focus on data engineering or data analytics
Strong proficiency in programming languages such as Python, SQL, Java, or Scala for data processing and analysis
Experience with data modeling, ETL processes, data warehousing, data integration, and data pipeline development
Proficiency in relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra)
Working knowledge of cloud platforms such as Snowflake, AWS, Azure, or Google Cloud Platform for data storage and processing
Experience with data visualization tools (e.g., Power BI)
Understanding of data quality principles, data governance, and data validation processes
Ability to analyze complex data sets, identify trends, patterns, and insights
Proficiency in troubleshooting data-related issues, identifying root causes, and implementing solutions
Familiarity with project management methodologies
Strong verbal and written communication skills
Willingness to stay updated with the latest data technologies, tools, and industry trends
Prior experience in data engineering, data analytics, or related roles with a track record of successful data project delivery
Technical Leadership: Ability to make informed, strategic decisions that align technology with business objectives
Customer Focus: Deep understanding of customer needs and how to translate them into effective technical solutions
Collaboration: Encourages collaboration across teams and stakeholders
Problem-Solving: Strong analytical and problem-solving skills, capable of addressing complex technical challenges
Innovation: Ability to lead innovation in technology while maintaining an eye on product-market fit and user experience
Agility: Adaptability in a fast-moving environment, with a mindset focused on delivering high-impact solutions quickly and iteratively
What we offer:
Commitment to your ongoing development, including on-the-job opportunities and formal programs
Inclusive parental leave entitlements for both parents
Values-led culture
Flexible work options
Generous annual leave, sick leave, and casual leave
Cultural and religious leave with flexible public holiday opportunities
A competitive remuneration package featuring performance-based incentives with uncapped Employer Provident Fund