
Market Data DevOps Engineer

BlackRock Investments

Location:
India, Gurgaon

Contract Type:
Not provided

Salary:

Not provided

Job Description:

We are looking for a Market Data DevOps Engineer to join the Production Engineering team within the Aladdin Platform Group. The ideal candidate has a passion for solving technical challenges, learning new technology tools, and improving procedures and processes. This role involves identifying trends and solving problems, with well-developed skills for quickly picking up new technologies and proprietary systems. You will serve as a technical product expert during product implementation projects, organize and implement system deployment plans, automate infrastructure management, and develop reports in compliance with applicable project requirements.

Job Responsibility:

  • Build, update, and maintain tooling to deploy, upgrade, monitor, and maintain market data infrastructure hosted both on-prem and in public cloud
  • Develop or modify solutions to automate tasks and meet business requirements
  • Help to suggest and design both short-term and long-term engineering solutions to improve existing operational workflows
  • Build relationships with internal and external interested parties to understand their roles and requirements and clearly communicate with them on progress, challenges, and questions
  • Ensure resilience and stability through quality configuration reviews, regression testing, and level two production support for owned tools
  • Provide on-call support on a rotational basis including weekend coverage
  • Collaborate with team members in a multi-office, multi-time zone environment

Requirements:

  • A degree specializing in Computer Science, Technology, MIS, Mathematics, Physics, Engineering or similar
  • 4-6 years of proven experience in an engineering role
  • Experience supporting market data server and desktop technologies & platforms like LSEG RTDS / Refinitiv TREP and DACS, ITRS Geneos, Bloomberg B-PIPE or SAPI, Tradeweb MDServer, LSEG QADirect, S&P Global Xpressfeed, Factset / Bloomberg / Eikon / Wind terminals
  • Experience in a programming or scripting language such as Python, Java or PowerShell and their respective testing modules
  • An understanding of Agile work environments, including knowledge of Azure DevOps, Git, and CI/CD
  • A strong understanding of TCP/IP and networking concepts
  • Experience solving clients' technical issues, and working with engineering teams, sales, professional services, and customers
  • Excellent communication and presentation skills
  • Ability to work with teams across geography
  • Ability to build and deliver process recommendations, documentation for internal wiki pages, and presentations

Nice to have:

  • Experience with both Linux and Windows server environments
  • Experience with Kubernetes environments
  • Experience with relational database systems such as SQL Server, PostgreSQL, MySQL
  • Experience with the OneTick market data platform
  • Previous experience working in the Financial Services or Technology industry
  • Experience with configuration management or Infrastructure as Code tools like Terraform, Ansible or Cutover

What we offer:
  • A strong retirement plan
  • Tuition reimbursement
  • Comprehensive healthcare
  • Support for working parents
  • Flexible Time Off (FTO)

Additional Information:

Job Posted:
February 20, 2026

Expiration:
April 01, 2026

Employment Type:
Full-time

Work Type:
Hybrid work

Similar Jobs for Market Data DevOps Engineer

Data Engineer

We are looking for a seasoned Data Engineer to join our MarTech and Data Strateg...
Location:
United States
Salary:
Not provided
Zion & Zion
Expiration Date:
Until further notice
Requirements:
  • 4-6 years of experience in a data engineering role
  • Significant experience and (preferably) certified in public cloud products: GCP Cloud Architect / Data Engineer or equivalent on AWS
  • Experience with tools like dbt
  • Familiar with ETL/ELT and (nice to have) reverse ETL platforms
  • Experience on a cloud-based or Business Intelligence project (as a technical project manager, developer or architect)
  • DevOps capabilities (CI/CD, Infrastructure as code, Docker, etc.)
  • Experience leveraging digital data in the cloud for marketing activations
  • Excellent verbal and written communication skills and comfortable working with both marketing and technical teams
  • Client-facing experience for detailed technical specifications discussions
  • Fluent in SQL, Python or R
Job Responsibility:
  • Work with internal and external teams to design and implement technical architecture in order to facilitate the advanced activation of data
  • Interacting with 3rd party MarTech solutions (Google Analytics, Google Marketing Platform, Data Management Platforms, Tag Management Solutions, Adobe Analytics, Clouds, Customer Data Platforms, etc.)
  • Come up with creative solutions to integrate data from a variety of sources into platforms and data warehouses
  • Work with internal teams to specify data processing pipelines (database schemas, integrity constraints, delivery throughput) for use case activations and implement them in the cloud
  • Work with internal data science team to scope and check typical machine learning and AI project requirements
  • Work with data visualization teams to design and implement tables to help power complex dashboards

DevOps Engineer

As a DevOps Engineer in the Global Markets division of a leading investment bank...
Location:
Canada, Mississauga
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • 4+ years of experience in DevOps, Site Reliability Engineering (SRE), or Cloud Infrastructure Engineering within a financial services or capital markets environment
  • Strong expertise in CI/CD automation using Tekton and Harness, with a deep understanding of pipeline orchestration and GitOps workflows
  • Proficiency in Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, and Ansible
  • Hands-on experience with containerization (Docker) and Kubernetes orchestration (K8s, OpenShift)
  • Knowledge of cloud environments (AWS, Azure, GCP) with hands-on experience in provisioning and optimizing cloud resources
  • Experience integrating monitoring, observability, and logging solutions (Prometheus, Grafana, ELK Stack, Splunk, Datadog)
  • Strong scripting and automation skills using Python, Shell scripting, or Go for process automation
  • Understanding of networking concepts related to low-latency trading environments, including service mesh architectures
  • Basic understanding of certificates, networking and load balancers
  • Experience troubleshooting build, deployment, and runtime issues in complex distributed systems
Job Responsibility:
  • Develop, maintain, and optimize CI/CD pipelines using Tekton and Harness for automating deployments of trading and risk applications
  • Implement Infrastructure-as-Code (IaC) solutions using Terraform and Ansible to manage cloud and on-prem environments
  • Work with containerization and orchestration technologies like Kubernetes, Docker, and OpenShift to deploy and manage microservices
  • Monitor, troubleshoot, and optimize build, test, and deployment workflows to enhance reliability and performance of global trading applications
  • Integrate observability, logging, and monitoring tools such as Prometheus, Grafana, ELK Stack, and Datadog for real-time infrastructure monitoring
  • Automate cloud provisioning, scaling, and resource optimization in multi-cloud environments (AWS, Azure, GCP)
  • Implement DevSecOps best practices, including security scanning, policy enforcement, and compliance automation within CI/CD pipelines
  • Work closely with trading technology, quant, and risk teams to optimize DevOps workflows for real-time data processing and algorithmic trading
  • Participate in incident response and root cause analysis, ensuring high availability and rapid recovery of trading services
  • Stay updated with emerging DevOps and cloud-native technologies, contributing to innovation and process improvements

DevOps Engineer with Application Support

This role combines DevOps engineering and application support responsibilities w...
Location:
Hungary, Budapest
Salary:
Not provided
Citi
Expiration Date:
Until further notice
Requirements:
  • BSc/MSc degree in Engineering or Computer Science preferred
  • Proficiency in English (written and spoken) is essential
  • Proficiency in Hungarian is a plus
  • Minimum of 3-4 years of experience in a similar field
  • Develop, maintain, and optimize CI/CD pipelines using TeamCity
  • Manage MQA's build infrastructure by maintaining build scripts and Incredibuild environment
  • Perform vulnerability scans and keep software/hardware stack up to date
  • Administer Linux/Windows servers and databases (MSSQL, MongoDB)
  • Oversee containerization and orchestration (Docker and Kubernetes desirable)
  • Implement and manage logging, monitoring (ELK stack, Nagios), and load balancing solutions
Job Responsibility:
  • Develop, maintain, and optimize CI/CD pipelines using TeamCity
  • Manage MQA's build infrastructure by maintaining build scripts and Incredibuild environment
  • Perform vulnerability scans and keep software/hardware stack up to date
  • Administer Linux/Windows servers and databases (MSSQL, MongoDB)
  • Oversee containerization and orchestration
  • Implement and manage logging, monitoring (ELK stack, Nagios), and load balancing solutions
  • Utilize scripting (Python, Bash) for automation
  • Administer Confluence and Jira
  • Provide first and second-line application support, including market data services and Excel add-in tools
  • Contribute to maintaining the development tools and infrastructure used by quants
What we offer:
  • Cafeteria Program
  • Home Office Allowance
  • Paid Parental Leave Program (maternity and paternity leave)
  • Private Medical Care Program and onsite medical rooms
  • Pension Plan Contribution to voluntary pension fund
  • Group Life Insurance
  • Employee Assistance Program
  • Access to learning and development programs, online course libraries and upskilling platforms
  • Flexible work arrangements
  • Career progression opportunities across geographies and business lines

Data Platform Engineering Director

The Lead – Data Platform Engineering will architect and build the data infrastru...
Location:
India, Gurgaon
Salary:
Not provided
BlackRock Investments
Expiration Date:
February 28, 2026
Requirements:
  • 10+ years in backend, data, or platform engineering (with at least 3–5 years dedicated to data infrastructure and cloud-native architecture)
  • Proven track record of building internal data platforms or developer tools at scale is highly desirable
  • A Bachelor’s degree or equivalent experience in Computer Science, Engineering, or related discipline is a must
  • Proficiency in data architecture protocols and previous involvement with financial or investment data systems is beneficial
  • Distributed systems expertise: Profound understanding of distributed data systems and advanced data technologies (e.g. Spark, Kafka, Airflow) for large-scale data processing
  • DevOps & Cloud proficiency: Expertise in implementing CI/CD pipelines, cloud orchestration, and infrastructure-as-code (Terraform, CDK) on modern cloud platforms (Azure preferred, also AWS/GCP)
  • Hands-on experience with containerization (Docker, Kubernetes) and cloud infrastructure automation is essential
  • Data pipeline development: Proficiency in designing and optimizing scalable data pipelines for data ingestion, transformation, and delivery
  • Experience with modern ETL/ELT frameworks (e.g. Apache Airflow, dbt, Azure Data Factory) for workflow orchestration is preferred
  • Programming & data modelling: Strong programming skills in Python, SQL, Scala with solid data modelling and database design experience
Job Responsibility:
  • Architect and build the data infrastructure that underpins our markets & investment analytics
  • Design scalable data frameworks, integrating firm-wide data assets, and enabling teams to deliver data solutions efficiently and safely
  • Spearhead the creation of our next-generation data ecosystem using DevOps practices, helping shape analytics that power its critical market and investment systems
  • Ensure a reliable, scalable, and secure data platform
What we offer:
  • Strong retirement plan
  • Tuition reimbursement
  • Comprehensive healthcare
  • Support for working parents
  • Flexible Time Off (FTO)

Senior C# Developer – Pricing Engines

Join a fast-paced trading tech environment where your C# code powers real-time p...
Location:
Netherlands, Amsterdam
Salary:
Not provided
Levy Professionals
Expiration Date:
Until further notice
Requirements:
  • Extensive experience with C#/.NET in complex back-end or trading systems
  • Proven low-latency, high-throughput engineering expertise (ideally in pricing or market data)
  • Deep knowledge of concurrency, async IO, performance tuning, profiling, TCP/binary protocols
  • Strong understanding of financial markets: FX spot/forwards/swaps, bonds, IR swaps, curves, PV/DV01
  • Experience working with front-office stakeholders
  • Clear communication and the ability to simplify complex concepts
Job Responsibility:
  • Build high-performance C# services on .NET Core for real-time pricing and risk
  • Refactor and migrate existing C# and Java components into a unified .NET Core platform
  • Optimise multithreading, memory, GC and network performance on bare-metal servers
  • Implement curve handling and curve location for FX, bonds and rates
  • Integrate with trading tools, market data feeds and external pricing/execution platforms
  • Ensure safe, efficient price distribution via internal messaging
  • Contribute to automated testing, CI/CD and observability in a 24/7 trading setup
  • Work in an agile DevOps team with shared support responsibilities

Staff Software Engineer, Data Infrastructure

At Docker, we make app development easier so developers can focus on what matter...
Location:
United States, Seattle
Salary:
195400.00 - 275550.00 USD / Year
Docker
Expiration Date:
Until further notice
Requirements:
  • 8+ years of software engineering experience with 3+ years focused on data engineering and analytics systems
  • Expert-level experience with Snowflake including advanced SQL, performance optimization, and cost management
  • Deep proficiency in DBT for data modeling, transformation, and testing with experience in large-scale implementations
  • Strong expertise with Apache Airflow for complex workflow orchestration and pipeline management
  • Hands-on experience with Sigma or similar modern BI platforms for self-service analytics
  • Extensive AWS experience including data services (S3, Redshift, EMR, Glue, Lambda, Kinesis) and infrastructure management
  • Proficiency in Python, SQL, and other programming languages commonly used in data engineering
  • Experience with infrastructure-as-code, CI/CD practices, and modern DevOps tools
  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience
  • Proven track record designing and implementing large-scale distributed data systems
Job Responsibility:
  • Define and drive the technical strategy for Docker's data platform architecture, establishing long-term vision for scalable data systems
  • Lead design and implementation of highly scalable data infrastructure leveraging Snowflake, AWS, Airflow, DBT, and Sigma
  • Architect end-to-end data pipelines supporting real-time and batch analytics across Docker's product ecosystem
  • Drive technical decision-making around data platform technologies, architectural patterns, and engineering best practices
  • Establish technical standards for data quality, testing, monitoring, and operational excellence
  • Design and build robust, scalable data systems that process petabytes of data and support millions of user interactions
  • Implement complex data transformations and modeling using DBT for analytics and business intelligence use cases
  • Develop and maintain sophisticated data orchestration workflows using Apache Airflow
  • Optimize Snowflake performance and cost efficiency while ensuring reliability and scalability
  • Build data APIs and services that enable self-service analytics and integration with downstream systems
What we offer:
  • Freedom & flexibility: fit your work around your life
  • Designated quarterly Whaleness Days plus end of year Whaleness break
  • Home office setup: we want you comfortable while you work
  • 16 weeks of paid parental leave
  • Technology stipend equivalent to $100 net/month
  • PTO plan that encourages you to take time to do the things you enjoy
  • Training stipend for conferences, courses and classes
  • Equity

Senior Engineer, Software

Our team is searching for a Sr Software Engineer to work with other software engin...
Location:
United States, Bellevue; Overland Park
Salary:
113600.00 - 205000.00 USD / Year
T-Mobile
Expiration Date:
Until further notice
Requirements:
  • Master’s degree in information systems and management, Electronic Engineering, Information Technology and Communications Engineering, Management Information Systems, Computer Science or related, and 3 years of relevant work experience
  • Bachelor’s degree in information systems and management, Electronic Engineering, Information Technology and Communications Engineering, Management Information Systems, Computer Science or related, and 5 years of relevant work experience
  • 4-7 years Technical engineering experience
  • Compensation domain experience required (4-6 years preferred)
  • Proven experience in programming languages: Shell Scripting, and SQL (T-SQL, PLSQL) in building database applications that operate at scale
  • Experience in testing, quality and change management methodologies
  • Expert knowledge of relational database design and support for on-prem and on cloud services (Oracle, MS SQL, RDS)
  • Experience with ETL methodologies and tools (PowerCenter)
  • Coding in a DevOps environment within a SCRUM framework to deliver Continuous Integration Continuous Delivery automations for market growth
  • At least 18 years of age
Job Responsibility:
  • Drives engineering projects by developing software solutions, conducting tests and inspections, and preparing reports and calculations
  • Expected to supervise base and associate level engineers as needed
  • Understands system protocols, how systems operate and data flows
  • Expected to independently develop a full software stack
  • Interact with system engineers to define system requirement and/or necessary requirements for automation
  • Contributes to designs to implement new ideas which utilize new frameworks to improve an existing or new system/process/service
  • Review existing designs and processes to highlight more efficient ways to complete existing workload more optimally through industry perspectives
  • Understands the creation of company IPR
What we offer:
  • Competitive base salary and compensation package
  • Annual stock grant
  • Employee stock purchase plan
  • 401(k)
  • Access to free, year-round money coaches
  • Medical, dental and vision insurance
  • Flexible spending account
  • Employee stock grants
  • Paid time off

Principal Data Platform Engineer

Imagine being at the forefront of a revolution powered by Data and AI (Artificia...
Location:
United States, Redmond
Salary:
139900.00 - 274800.00 USD / Year
Microsoft Corporation
Expiration Date:
Until further notice
Requirements:
  • Bachelor's Degree in Computer Science or related technical field AND 6+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience
  • Master's Degree in Computer Science or related technical field AND 8+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR Bachelor's Degree in Computer Science or related technical field AND 12+ years technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python OR equivalent experience
  • 6+ years of software development lifecycle experience, including 2+ years in a lead role managing big data platforms
  • 2+ years of Infrastructure-as-Code (Terraform, Bicep, ARM)
  • 2+ years of CI/CD pipelines and DevOps automation
  • Identity & access management (RBAC, PIM, conditional access)
  • 2+ years of governance frameworks (data classification, policy enforcement, compliance controls)
  • Observability (OpenTelemetry, monitoring, logging, SLOs)
  • Incident management and reliability engineering practices
Job Responsibility:
  • Develop and implement a federated infrastructure strategy for marketing data focused on cost optimization and scalability
  • Develop monitoring systems and processes for Data Platform infrastructure, pipelines, usage, and access patterns
  • Build capabilities for data discovery, access management, policy enforcement, and lineage tracking, aligned with the DE data platform vision
  • Create and deploy frameworks and tools for data quality measurement and monitoring to deliver secure, sustainable, high-performing, and reliable marketing data for consumer needs
  • Design and roll out data storage framework and solution in line with medallion architecture and automated role-based access controls across all access layers
  • Establish engineering excellence through best practices, coding standards, and robust code management tooling
  • Collaborate with Data Product Engineering, Data Ops, and Partner Engineering teams to deliver end-to-end platform solutions
  • Partner with Fabric, Purview, Azure ML teams to influence product roadmaps and address feature gaps
  • Implement shared tools for cross-tenant Azure data intake and publishing
  • Deploy capabilities for data classification, tagging, retention, archival, and deletion to meet privacy policy requirements