HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. If you are a passionate data engineer with expertise in ETL processes and a desire to make a significant impact within our organisation, we encourage you to apply for this exciting opportunity!
Job Responsibilities:
Design and Develop ETL Processes: Lead the design and implementation of ETL processes using batch and streaming tools to extract, transform, and load data from various sources into GCP
Collaborate with stakeholders to gather requirements and ensure that ETL solutions meet business needs
Data Pipeline Optimization: Optimize data pipelines for performance, scalability, and reliability, ensuring efficient data processing workflows
Monitor and troubleshoot ETL processes, proactively addressing issues and bottlenecks
Data Integration and Management: Integrate data from diverse sources, including databases, APIs, and flat files, ensuring data quality and consistency
Manage and maintain data storage solutions in GCP (e.g., BigQuery, Cloud Storage) to support analytics and reporting
GCP Dataflow Development: Write Apache Beam-based Dataflow jobs for data extraction, transformation, and analysis, ensuring optimal performance and accuracy (see the illustrative Beam sketch after this list)
Collaborate with data analysts and data scientists to prepare data for analysis and reporting
Automation and Monitoring: Implement automation for ETL workflows using tools like Apache Airflow or Cloud Composer, enhancing efficiency and reducing manual intervention (an illustrative Airflow sketch also follows this list)
Set up monitoring and alerting mechanisms to ensure the health of data pipelines and compliance with SLAs
Data Governance and Security: Apply best practices for data governance, ensuring compliance with industry regulations (e.g., GDPR, HIPAA) and internal policies
Collaborate with security teams to implement data protection measures and address vulnerabilities
Documentation and Knowledge Sharing: Document ETL processes, data models, and architecture to facilitate knowledge sharing and onboarding of new team members
Conduct training sessions and workshops to share expertise and promote best practices within the team
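As an illustration of the Dataflow responsibility above, here is a minimal sketch of an Apache Beam pipeline (Python SDK) that reads JSON lines from Cloud Storage, applies a simple transformation, and writes the results to BigQuery. The project, bucket, dataset, table, and schema names, as well as the parsing logic, are hypothetical placeholders rather than anything specified for this role.

```python
# Minimal Apache Beam sketch (Python SDK) for a GCP Dataflow job.
# Project, bucket, table, and schema names below are hypothetical examples.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_record(line: str) -> dict:
    """Parse one JSON line into a dict matching the BigQuery schema."""
    record = json.loads(line)
    return {"id": record["id"], "amount": float(record["amount"])}


def run() -> None:
    options = PipelineOptions(
        runner="DataflowRunner",          # use "DirectRunner" for local testing
        project="example-project",        # hypothetical project id
        region="europe-west2",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.json")
            | "Parse" >> beam.Map(parse_record)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.transactions",
                schema="id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Submitted with the DataflowRunner, the same pipeline code runs as a managed Dataflow job; switching to the DirectRunner keeps it runnable locally during development.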
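For the automation and monitoring responsibility, the sketch below shows one way an ETL workflow might be orchestrated with Apache Airflow, the scheduler underlying Cloud Composer. The DAG id, schedule, and task callables are hypothetical examples, not a description of HSBC's actual pipelines (the `schedule` argument assumes Airflow 2.4 or later).

```python
# Minimal Apache Airflow sketch for scheduling an ETL workflow.
# DAG id, schedule, and task logic are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("extract source data")   # placeholder for a real extraction step


def transform() -> None:
    print("transform records")     # placeholder for a real transformation step


def load() -> None:
    print("load into BigQuery")    # placeholder for a real load step


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In practice the PythonOperator tasks would be replaced by operators that trigger the actual extraction, Dataflow, and BigQuery load steps, with Airflow's alerting features covering the SLA monitoring mentioned above.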
Requirements:
Education: Bachelor’s degree in Computer Science, Information Systems, or a related field
Experience: Minimum of 5 years of industry experience in data engineering or ETL development, with a strong focus on DataStage and GCP
Proven experience in designing and managing ETL solutions, including data modeling, data warehousing, and SQL development
Technical Skills: Strong knowledge of GCP services (e.g., BigQuery, Dataflow, Cloud Storage, Pub/Sub) and their application in data engineering
Experience with cloud-based solutions, especially GCP; cloud-certified candidates are preferred
Experience with and knowledge of big data processing in both batch and streaming modes, and proficiency in big data ecosystems such as Hadoop, HBase, Hive, MapReduce, Kafka, Flink, and Spark
Familiarity with Java and Python for data manipulation on cloud and big data platforms
Analytical Skills: Strong problem-solving skills with a keen attention to detail
Ability to analyze complex data sets and derive meaningful insights
What we offer:
Competitive salary and comprehensive benefits package
Opportunity to work in a dynamic and collaborative environment on cutting-edge data projects
Professional development opportunities to enhance your skills and advance your career