The COO Technology group delivers technology solutions for the Chief Operating Office, supporting operations, control executives, strategic execution, business continuity, resiliency, data services, regulatory relations, customer experience, enterprise shared services, supply chain management, and corporate properties. Our mission is to modernize and optimize technology platforms for these critical functions.

We are seeking a highly motivated Senior Engineering Manager to lead our Data Engineering organization as we modernize our data platforms and products on Google Cloud Platform (GCP). You will own strategy and execution for enterprise data migration and build-out on GCP, including streaming and batch pipelines, data lake/lakehouse, governance, and data products, while managing high-performing teams of data engineers, platform engineers, and SRE/DevOps engineers. This role partners closely with architecture, cybersecurity, risk, and line-of-business stakeholders to deliver secure, compliant, scalable, and cost-efficient data capabilities across the firm.
Job Responsibilities:
Lead & Develop Teams: Manage, coach, and grow multiple agile teams (data engineering, platform engineering, SRE/DevOps, QA) to deliver high-quality, resilient data capabilities; build a culture of talent development, engineering excellence, psychological safety, and continuous improvement
Own the GCP Data Platform Roadmap: Define and drive the roadmap for GCP-based data platforms (BigQuery, Dataflow/Apache Beam, Pub/Sub, Dataproc/Spark, Cloud Storage, Cloud Composer/Airflow, Dataplex, Data Catalog); a minimal streaming-pipeline sketch follows this list
Migrate Legacy Workloads: Lead the migration of legacy data pipelines, warehouses, and integration workloads to GCP (including CDC, batch & streaming, API-first data products, and event-driven architectures)
Engineering & Architecture: Partner with enterprise, data, and security architects to align on target-state architecture, data modeling (dimensional, Data Vault), and domain-driven data products; establish and enforce DataOps and DevSecOps practices (CI/CD, IaC/Terraform, automated testing, observability)
Security, Risk & Compliance: Embed defense-in-depth controls (VPC Service Controls, private IP, CMEK/Cloud KMS, DLP, IAM least privilege, tokenization, data masking, and lineage); ensure adherence to financial services regulations and standards (e.g., SOX, GLBA, BCBS 239, model governance)
Reliability & Cost Management: Define SLOs/SLIs, runbooks, incident response, capacity planning, and performance tuning for BigQuery/Dataflow/Spark workloads; optimize cost and performance via partitioning/clustering, workload management, autoscaling, and right-sizing (the second sketch after this list shows CMEK encryption and partitioning/clustering in practice)
Stakeholder & Vendor Management: Influence senior technology leaders and business stakeholders; translate business needs into platform roadmaps and measurable outcomes; manage budgets, resource plans, and strategic vendor/partner engagements
Enablement & Adoption: Scale onboarding of lines of business to the platform, including templates, blueprints, guardrails, and self-service developer experience
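To make the pipeline responsibilities above concrete, here is a minimal sketch, assuming Apache Beam's Python SDK on Dataflow, of a streaming pipeline that reads events from Pub/Sub and writes them to BigQuery. The project, bucket, topic, table, and schema names are hypothetical placeholders, not references to any real system.

```python
# Minimal streaming sketch: Pub/Sub -> parse -> BigQuery with Apache Beam.
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,                           # unbounded source -> streaming mode
        runner="DataflowRunner",
        project="example-project",                # hypothetical project ID
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            # Consume raw event bytes from a (hypothetical) Pub/Sub topic.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events"
            )
            # Decode each message into a dict matching the BigQuery schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Stream rows into a (hypothetical) BigQuery table.
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```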
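The security and cost bullets above name specific BigQuery controls. This second sketch, assuming the google-cloud-bigquery client library, shows two of them: a dataset whose tables default to a customer-managed Cloud KMS key (CMEK), and a table partitioned by day and clustered on common filter columns so queries scan only the partitions and blocks they need. All project, dataset, table, and key names are hypothetical.

```python
# Minimal sketch of CMEK encryption plus partitioning/clustering in BigQuery.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Dataset whose tables default to a customer-managed Cloud KMS key (CMEK).
dataset = bigquery.Dataset("example-project.risk_reporting")
dataset.location = "US"
dataset.default_encryption_configuration = bigquery.EncryptionConfiguration(
    kms_key_name=(
        "projects/example-project/locations/us/"
        "keyRings/data-platform/cryptoKeys/bq-cmek"  # hypothetical key
    )
)
client.create_dataset(dataset, exists_ok=True)

# Table partitioned by event day and clustered on common filter columns.
table = bigquery.Table(
    "example-project.risk_reporting.transactions",
    schema=[
        bigquery.SchemaField("txn_id", "STRING"),
        bigquery.SchemaField("account_id", "STRING"),
        bigquery.SchemaField("txn_ts", "TIMESTAMP"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="txn_ts"
)
table.clustering_fields = ["account_id", "txn_id"]
client.create_table(table, exists_ok=True)
```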
Requirements:
6+ years of Data Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
3+ years of management or leadership experience
3+ years of experience managing direct reports on multi-disciplinary engineering teams, including, but not limited to, assigning tasks, conducting performance evaluations, and determining salary adjustments
6+ years of hands-on experience in data engineering and platform build-outs using modern stacks and automation
4+ years of production experience on GCP with several of: BigQuery, Dataflow/Apache Beam, Pub/Sub, Dataproc/Spark, Cloud Storage, Cloud Composer/Airflow, Dataplex, Data Catalog
6+ years of experience with programming and data skills: SQL plus one or more of Python, Java, or Scala; solid understanding of data modeling and data quality frameworks
4+ years of DevOps/DataOps experience: CI/CD, Terraform/IaC, automated testing, observability, GitOps (a minimal testing sketch follows this list)
6+ years of experience with large-scale data platform migrations or modernizations in regulated environments
3+ years leading AI/ML and Generative AI data initiatives
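As a concrete illustration of the DevOps/DataOps requirement above, here is a minimal sketch of the automated-testing piece: a pytest unit test that pins down a pipeline transform's behavior before CI/CD promotes it. parse_event is a hypothetical transform, similar to the parsing step in the streaming sketch earlier.

```python
# Minimal DataOps testing sketch: unit-test a (hypothetical) pipeline
# transform so CI/CD catches regressions before deployment.
import json

import pytest


def parse_event(raw: bytes) -> dict:
    """Decode a raw Pub/Sub message and validate required fields."""
    row = json.loads(raw.decode("utf-8"))
    missing = {"event_id", "event_ts"} - row.keys()
    if missing:
        raise ValueError(f"event missing required fields: {missing}")
    return row


def test_parse_event_accepts_valid_payload():
    raw = b'{"event_id": "abc", "event_ts": "2024-01-01T00:00:00Z"}'
    assert parse_event(raw)["event_id"] == "abc"


def test_parse_event_rejects_incomplete_payload():
    with pytest.raises(ValueError):
        parse_event(b'{"event_id": "abc"}')
```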
Nice to have:
Certifications: Google Professional Data Engineer and/or Google Professional Cloud Architect (plus: CKA/CKAD, OpenShift, Azure/AWS exposure)
Experience with dbt, Great Expectations, OpenLineage/Marquez, DataHub, or similar data quality/lineage/governance tools
Experience with Kafka/Confluent and event-driven architectures
Databricks on GCP and/or Vertex AI (MLOps) exposure
Security-by-design: IAM, KMS/CMEK, VPC Service Controls, private services, network segmentation, DLP, and sensitive data handling (PII/PCI)
Financial services domain knowledge (e.g., risk reporting, liquidity, AML/fraud, trading, consumer banking) and familiarity with SOX, GLBA, BCBS 239, model risk management practices
Track record driving change management in large, matrixed organizations and establishing platform guardrails and golden paths at scale
Communicate crisply with audiences ranging from engineers to executives; create clear decision frameworks and status reporting for senior leadership
Operate effectively in a matrixed organization; influence without direct authority and build trusted partnerships across Technology, Risk, and Business
Thrive in ambiguity; facilitate structured problem-solving to converge on pragmatic solutions
Make high-quality decisions quickly with a results-oriented, collaborative leadership style
Serve as an internal thought leader on GCP data platforms, data products, and modern engineering
What we offer:
Health benefits
401(k) Plan
Paid time off
Disability benefits
Life insurance, critical illness insurance, and accident insurance