This role focuses on automating data movement and transformation in a banking environment, developing data pipelines and testing frameworks, and leveraging containerization and cloud platforms. The position involves designing scalable ETL processes, leading automation strategies, and collaborating with stakeholders to deliver reliable data solutions.
Job Responsibilities:
Create programs using PySpark to extract data from various sources, clean and transform it, and load it into target systems (a minimal sketch follows this list)
Develop automated tests to ensure the data pipelines are working correctly and the data is accurate
Package data pipelines into containers using Docker and manage their execution using orchestration tools like AWS EKS
Work with various cloud services like AWS S3, Lambda, and Airflow for data storage, processing, and scheduling
Oversee test data strategies and environment simulations for scalable, reliable automation
Build and maintain ETL validation and testing scripts that run on Red Hat OpenShift containers
Extract, transform, and load datasets from Hive, HDFS, and Oracle data sources
Deploy and orchestrate ETL jobs using AWS EKS and integrate them into workflows
Create reusable frameworks, libraries, and templates to accelerate automation and testing of ETL jobs
Participate in code reviews, CI/CD pipelines, and maintain best practices in Spark and cloud-native development
Lead a team of automation professionals, guiding them on projects and helping them develop their skills
Define the overall strategy for automating data processes and testing
Research and implement new automation tools and techniques to improve efficiency
Collaborate with other teams and partners to ensure smooth data operations and meet regulatory requirements
Track key performance indicators related to automation for the entire D&A team and report on progress to leadership
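To make the extract-clean-load flow in the responsibilities above concrete, here is a minimal PySpark sketch. It is illustrative only and is not taken from the role description: the table names (src_db.transactions, clean_db.transactions) and columns (txn_id, txn_ts) are hypothetical placeholders.

# Minimal PySpark ETL sketch (illustrative only): extract from a Hive source,
# apply basic cleaning, and load into a curated target table.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("transactions-etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: read the raw dataset from a Hive source table
raw = spark.table("src_db.transactions")

# Transform: drop duplicates, remove rows with a null key, derive a date column
clean = (
    raw.dropDuplicates(["txn_id"])
       .filter(F.col("txn_id").isNotNull())
       .withColumn("txn_date", F.to_date("txn_ts"))
)

# Load: write the cleaned data to a target Hive table, partitioned by date
clean.write.mode("overwrite").partitionBy("txn_date").saveAsTable("clean_db.transactions")

spark.stop()

In practice, a job like this would be packaged into a Docker image and run on AWS EKS or Red Hat OpenShift and scheduled through Airflow, as described in the responsibilities above.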
Requirements:
12-15 years of experience in automation testing across UI, data analytics, and BI reports in the financial services industry, especially with knowledge of regulatory compliance and risk management
Extensive knowledge of developing and maintaining automation frameworks and AI/ML-related solutions
Detailed knowledge of data flows in relational databases and big data platforms, including familiarity with Hadoop (a platform for processing massive datasets)
Selenium and BDD Cucumber using Java and Python
Strong experience with PySpark for batch and stream processing, including deploying PySpark workloads to AWS EKS (Kubernetes)
Proficiency in working with the Cloudera Hadoop ecosystem (HDFS, Hive, YARN)
Hands-on experience with ETL automation and validation frameworks
Strong knowledge of Oracle SQL and HiveQL
Familiarity with Red Hat OpenShift for container-based deployments
Proficient in creating Dockerfiles and managing container lifecycle
Solid understanding of AWS services like S3, Lambda, EKS, Airflow, and IAM
Experience with Airflow DAGs to orchestrate ETL jobs (a minimal sketch follows this list)
Familiarity with CI/CD tools (e.g., Jenkins, GitLab CI)
Scripting knowledge in Bash, Python, and YAML
Version control: Git, Bitbucket, GitHub
Experience in automating BI report validation, e.g., Tableau dashboards and views
Hands-on experience in Python for developing data analysis utilities using Pandas, NumPy, etc.
Experience with mobile testing using Perfecto, API testing (SoapUI, Postman/REST Assured), and SAS tools will be an added advantage
Strong problem-solving and debugging skills
Excellent communication and collaboration abilities to lead and mentor a large techno-functional team across different geographical locations
Strong acumen and great presentation skills
Able to work in an Agile environment and deliver results independently
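As a companion to the Airflow requirement above, here is a minimal sketch of a DAG that schedules a daily spark-submit of an ETL script. It assumes Airflow 2.x, and the dag_id, schedule, and script path are illustrative assumptions rather than details from the posting.

# Minimal Airflow 2.x DAG sketch (illustrative only): run a PySpark ETL script daily.
# The dag_id, schedule, and script path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_transactions_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job; in an EKS setup this step could instead launch the
    # workload on the cluster via a Kubernetes-based operator.
    run_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command="spark-submit /opt/jobs/transactions_etl.py",
    )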