The Big Data Tech Delivery Lead is a senior-level position responsible for establishing and implementing new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to lead applications systems analysis and programming activities.
Job Responsibilities:
Design and implement robust data pipelines and systems for data ingestion, processing, and storage
Develop, test, and maintain big data solutions using Java and related frameworks (e.g., Hadoop, Spark, Kafka)
Create and optimize algorithms and data processing logic to extract insights from large datasets
Work with large datasets, ensuring data quality, integrity, and security within big data ecosystems
Provide advanced applications programming expertise and plan assignments involving large budgets, cross-functional projects, or multiple projects
Develop application methodologies and standards for program analysis, design, coding, testing, debugging, and implementation
Utilize advanced knowledge of supported main system flows and comprehensive knowledge of multiple areas to achieve technology goals
Research and evaluate new big data technologies and AI/ML advancements to enhance system capabilities and performance
Influence and negotiate with senior leaders and communicate with external parties
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency
Requirements:
10+ years of relevant experience building data engineering solutions for large-scale Reporting and Data Warehouse implementations
8+ years of experience building enterprise data warehouse systems, preferably in the finance sector
8+ years of relevant experience in enterprise Application Development
Knowledge of the Big Data/Spark suite of products is preferable
Technical Skills - Java/J2EE, Hadoop, Spark, Hive, Impala, Kafka, and Elastic, with a focus on data analysis/analytics
Core Skills - 10+ years of experience managing large teams across:
IT Project Design and Development
Enterprise Architecture
Data & Database Architecture
Project Management
Software Development Life Cycle and Risk Management
Management of large on/offshore teams
Ability to develop strong working relationships
Ability to manage multiple activities and changing priorities
Ability to work under pressure and to meet tight deadlines
Self-starter with ability to take the initiative and master new tasks quickly
Methodical, with strong attention to detail
Bachelor’s degree/University degree or equivalent experience
Master’s degree preferred
What we offer:
medical, dental & vision coverage
401(k)
life, accident, and disability insurance
wellness programs
paid time off packages, including planned time off (vacation), unplanned time off (sick leave), and paid holidays
discretionary and formulaic incentive and retention awards