
Ab Initio Data Engineer

Citi

Location:
India, Chennai

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Job Responsibility:

  • Ability to design and build Ab Initio graphs (both continuous & batch) and Conduct>it Plans
  • Build Web-Service and RESTful graphs and create RAML or Swagger documentation
  • Complete understanding of, and analytical ability with, the Metadata Hub metamodel
  • Strong hands-on multifile system programming, debugging and optimization skills
  • Hands on experience in developing complex ETL applications
  • Good knowledge of RDBMS – Oracle, with ability to write complex SQL needed to investigate and analyze data issues
  • Strong in UNIX Shell/Perl Scripting
  • Build graphs interfacing with heterogeneous data sources – Oracle, Snowflake, Hadoop, Hive, AWS S3
  • Build application configurations for Express>It frameworks – Acquire>It, Spec-To-Graph, Data Quality Assessment
  • Build automation pipelines for Continuous Integration & Delivery (CI/CD), leveraging Testing Framework & JUnit modules, integrating with Jenkins, JIRA and/or ServiceNow
  • Build Query>It data sources for cataloguing data from different sources
  • Parse XML, JSON & YAML documents including hierarchical models
  • Build and implement data acquisition and transformation/curation requirements in a data lake or warehouse environment, and demonstrate experience in leveraging various Ab Initio components
  • Build Autosys or Control Center Jobs and Schedules for process orchestration
  • Build BRE rulesets for reformat, rollup & validation use cases
  • Build SQL scripts on database, performance tuning, relational model analysis and perform data migrations
  • Ability to identify performance bottlenecks in graphs, and optimize them
  • Ensure Ab Initio code base is appropriately engineered to maintain current functionality and development that adheres to performance optimization, interoperability standards and requirements, and compliance with client IT governance policies
  • Build regression test cases, functional test cases and write user manuals for various projects
  • Conduct bug fixing, code reviews, and unit, functional and integration testing
  • Participate in the agile development process, and document and communicate issues and bugs relative to data standards
  • Pair up with other data engineers to develop analytic applications leveraging Big Data technologies: Hadoop, NoSQL, and In-memory Data Grids
  • Challenge and inspire team members to achieve business results in a fast paced and quickly changing environment
  • Perform other duties and/or special projects as assigned
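
Several of the responsibilities above involve parsing hierarchical XML, JSON & YAML documents. As a minimal, tool-agnostic illustration (outside Ab Initio itself, which is proprietary), here is a Python sketch of flattening nested JSON into path-keyed fields; the sample document and the `flatten` helper are invented for this sketch:

```python
import json

def flatten(node, prefix=""):
    """Recursively flatten a nested JSON value into dotted-path keys."""
    items = {}
    if isinstance(node, dict):
        for key, value in node.items():
            items.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(node, list):
        for index, value in enumerate(node):
            items.update(flatten(value, f"{prefix}{index}."))
    else:
        # Leaf value: strip the trailing dot from the accumulated path.
        items[prefix[:-1]] = node
    return items

doc = json.loads('{"customer": {"id": 42, "accounts": [{"ccy": "USD"}]}}')
print(flatten(doc))  # {'customer.id': 42, 'customer.accounts.0.ccy': 'USD'}
```

The same path-flattening idea applies to YAML or XML once the document is loaded into nested dicts and lists.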

Requirements:

  • Bachelor's degree in a quantitative field (such as Engineering, Computer Science, Statistics, Econometrics)
  • Minimum 5 years of experience in the design, build and deployment of Ab Initio-based applications
  • Expertise in handling complex large-scale Data Lake and Warehouse environments
  • Hands-on experience writing complex SQL queries, exporting and importing large amounts of data using utilities
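
The "complex SQL" requirement typically means investigative queries over production tables. As a hedged sketch, with an in-memory SQLite table standing in for a production Oracle schema (the `txn` table and its rows are invented for illustration), a GROUP BY/HAVING query is a common first step when analyzing data issues such as duplicate records:

```python
import sqlite3

# Illustrative only: an in-memory SQLite table stands in for a production
# Oracle table; the txn schema and rows are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (acct_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO txn VALUES (?, ?)",
                 [(1, 100.0), (1, 100.0), (2, 50.0)])

# A typical data-issue investigation: which accounts carry exact
# duplicate rows?
dupes = conn.execute("""
    SELECT acct_id, amount, COUNT(*) AS n
    FROM txn
    GROUP BY acct_id, amount
    HAVING COUNT(*) > 1
""").fetchall()
print(dupes)  # [(1, 100.0, 2)]
```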

Additional Information:

Job Posted:
March 22, 2025

Employment Type:
Full-time
Work Type:
Hybrid work

Similar Jobs for Ab Initio Data Engineer

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
  • Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
Employment Type: Full-time

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • First Class Degree in Engineering/Technology/MCA
  • 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica
  • Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
Employment Type: Full-time

Senior Data Engineer

Join Inetum as a Data Engineer! At Inetum, we empower innovation and growth thro...
Location:
Portugal, Lisbon
Salary:
Not provided
Inetum
Expiration Date:
Until further notice
Requirements:
  • Teradata – advanced SQL and data warehousing
  • CONTROL-M – job scheduling and automation
  • UNIX – working in a UNIX environment (directories, scripting, etc.)
  • SQL (Teradata) – strong querying and data manipulation skills
  • Ab Initio – data integration and ETL development
  • DevOps – CI/CD practices and automation
  • Collaborative tools – GIT, Jira, Confluence, MEGA, Zeenea
Job Responsibility:
  • Design, development, and optimization of data solutions that support business intelligence and analytics
Employment Type: Full-time

Lead Data Engineer

Citi Fund Services is undergoing a major transformation effort to transform t...
Location:
United Kingdom, Belfast
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Significant years of hands-on experience in software development, with proven experience in data integration / data pipeline developments
  • Exceptional technical leader with a proven background in delivery of significant projects
  • Multi-year experience in data integration development (Ab Initio, Talend, Apache Spark, AWS Glue, SSIS or equivalent) including optimization, tuning and benchmarking
  • Multi-year experience in SQL (Oracle, MSSQL and equivalents) including optimization, tuning and benchmarking
  • Expertise with Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) a significant plus
  • Strong understanding of Agile methodologies (Scrum, Kanban) and experience working in Agile teams
  • Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premises or public cloud (i.e., Tekton, Harness, Jenkins, etc.)
  • Demonstrable expertise in financial services considered a plus
  • Self-starter with the ability to drive projects independently and deliver results in a fast-paced environment
Job Responsibility:
  • Architect and develop enterprise-scale data pipelines using the latest data streaming technologies
  • Implement and optimize delivered solutions through tuning for optimal performance through frequent benchmarking
  • Develop containerised solutions capable of running in private or public cloud
  • Ensure the solution is aligned to CI/CD tooling and standards
  • Ensure the solution is aligned to observability standards
  • Effectively communicate technical solutions and artifacts to non-technical stakeholders and senior leadership
  • Contribute to the journey of modernizing existing data processors and moving to common and cloud platforms
  • Collaborate with cross-functional domain experts to translate business requirements into scalable data solutions
What we offer:
  • 27 days annual leave (plus bank holidays)
  • A discretionary annual performance-related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources
Employment Type: Full-time

Data Engineer

The Data Engineer is accountable for developing high quality data products to su...
Location:
India, Pune
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • First Class Degree in Engineering/Technology (4-year graduate course)
  • 4 to 8 years' experience implementing data-intensive solutions using agile methodologies
  • Experience of relational databases and using SQL for data querying, transformation and manipulation
  • Experience of modelling data for analytical consumers
  • Ability to automate and streamline the build, test and deployment of data pipelines
  • Experience in cloud native technologies and patterns
  • A passion for learning new technologies, and a desire for personal growth
  • Excellent communication and problem-solving skills
  • ETL: Hands-on experience building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
  • Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
Job Responsibility:
  • Developing and supporting scalable, extensible, and highly available data solutions
  • Deliver on critical business priorities while ensuring alignment with the wider architectural vision
  • Identify and help address potential risks in the data supply chain
  • Follow and contribute to technical standards
  • Design and develop analytical data models
What we offer:
  • Programs and services for physical and mental well-being including access to telehealth options, health advocates, confidential counseling
  • Empowerment to manage financial well-being and help plan for the future
Employment Type: Full-time

Data Engineer

Citi Fund Services is undergoing a major transformation effort to transform t...
Location:
United Kingdom, Belfast
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Substantial hands-on experience in software development, with proven experience in data integration / data pipeline developments
  • Proven experience in Data integration development (Ab Initio, Talend, Apache Spark, AWS Glue, SSIS or equivalent) including optimization, tuning and benchmarking
  • Proven experience in SQL (Oracle, MSSQL and equivalents) including optimization, tuning and benchmarking
  • Understanding of Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.) a significant plus
  • Understanding of Agile methodologies (Scrum, Kanban) and experience working in Agile teams
  • Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premises or public cloud (i.e., Tekton, Harness, Jenkins, etc.)
  • Demonstrable expertise in financial services considered a plus
  • Self-starter with the ability to drive projects independently and deliver results in a fast-paced environment.
Job Responsibility:
  • Develop enterprise-scale data pipelines using the latest data streaming technologies
  • Optimize delivered solutions through tuning for optimal performance through frequent benchmarking
  • Develop containerised solutions capable of running in private or public cloud
  • Ensure the solution is aligned to CI/CD tooling and standards
  • Ensure the solution is aligned to observability standards
  • Contribute to the journey of modernizing existing data processors and moving to common and cloud platforms
  • Collaborate with cross-functional domain experts to translate business requirements into scalable data solutions
Employment Type: Full-time

Solution Architect

The Solution Architect role involves driving the architectural transformation fo...
Location:
Ireland, Dublin
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • Significant experience in Data modeling, Data lineage analysis, Operational reporting, preferably in a global organization
  • Proven architecture experience in solutioning of horizontally scalable, highly available, highly resilient data distribution platforms
  • Proficient in message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Extensive experience with Data Integration patterns
  • Extensive experience with Real/Near Real time streaming patterns
  • Strong background in Data Management, Data Governance, Transformation initiatives preferred
  • Preferred experience/familiarity with one or more of these tools: big data platforms (Hadoop, Apache Kafka); relational SQL, NoSQL and cloud-native databases (Postgres, Cassandra, Snowflake); data pipeline and orchestration tools (Azkaban, Luigi or Airflow); stream-processing engines (Apache Spark, Apache Storm or Apache Flink); ETL tools (Talend, Ab Initio); data analytics/visualization tools (Looker, Mode or Tableau)
Job Responsibility:
  • Re-engineering the interaction of incoming and outgoing data flows from the Core Accounts DDA platform to Reference Data platforms, Data Warehouse, Data Lake as well as other local reporting systems which consume data from Core Accounts
  • Drive data architecture and roadmap for eliminating non-strategic point-to-point connections and batch handoffs
  • Define canonical data models for key entities and events related to Customer, Account, Core DDA in line with the Data Standards
  • Assess opportunities to simplify/rationalize/refactor the existing database schemas paving way for modularization of the existing stack
  • Provide technical guidance to Data Engineers responsible for designing an Operational Data Store for intra-day and end-of-day reporting
  • Implement data strategies and develop logical and physical data models
  • Formulate an efficient approach and strategy to rationalize and migrate reports
  • Build and nurture a strong engineering organization to deliver value to internal and external clients
  • Act as an SME to senior stakeholders in business, operations, and technology divisions across upstream and downstream organizations
  • Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users
What we offer:
  • Competitive base salary (annually reviewed)
  • Hybrid working model (up to 2 days working at home per week)
  • Additional benefits supporting you and your family
Employment Type: Full-time

Data Science Intern

Designs, develops, and applies programs, methodologies, and systems based on adv...
Location:
United States, Ft. Collins
Salary:
35.00 - 46.00 USD / Hour
Hewlett Packard Enterprise
Expiration Date:
May 26, 2026
Requirements:
  • Working towards a Bachelor's and/or Master's degree with a focus in Data Science, Computer Science, Computer Engineering, Software development, or other IT related field
  • Basic knowledge of data science methodologies
  • Basic understanding of business requirements and data science objectives
  • Basic data mapping, data transfer and data migration skills
  • Basic understanding of analytics software (e.g., R, SAS, SPSS, Python)
  • Basic knowledge of machine learning, data integration, and modeling skills and ETL tools (e.g., Informatica, Ab Initio, Talend)
  • Basic communication and presentation skills
  • Basic data knowledge of relevant data programming languages
  • Basic knowledge of data visualization techniques
Job Responsibility:
  • Participates in the analysis and validation of data sets/solutions/user experience
  • Aids in the development, enhancement and maintenance of a client's metadata based on analytic objectives
  • May load data into the infrastructure and contributes to the creation of the hypothesis matrix
  • Prepares a portion of the data for the Exploratory Data Analysis (EDA) / hypotheses
  • Contributes to building models for the overall solution, validates results and performance
  • Contributes to the selection of the model that supports the overall solution
  • Supports the research, identification and delivery of data science solutions to problems
  • Supports visualization of the model's insights, user experience and configuration tools for the analytics model
What we offer:
  • Comprehensive suite of benefits that supports physical, financial and emotional wellbeing
  • Specific programs catered to helping reach career goals
  • Unconditional inclusion and flexibility to manage work and personal needs
Employment Type: Full-time