Data Platform Architect

GoFundMe

Location:
United States, San Francisco

Contract Type:
Not provided

Salary:

253,500.00 - 380,500.00 USD / Year

Job Description:

We are seeking a Data Platform Architect to join our Data Platform team. This role will be pivotal in modernizing our existing platform and building out a scalable data, reporting, and AI architecture that enhances our product offerings and meets new business requirements. You will design the blueprint for a modern cloud-native data ecosystem, ensuring it is robust, secure, and future-ready. By establishing standards, optimizing performance, and partnering cross-functionally, you will enable GoFundMe to harness data as a strategic asset that powers insights, innovation, and growth.

Job Responsibility:

  • Modernize, optimize, and evolve a cloud-native data stack into a scalable, secure, and high-performing ecosystem, including SQL and NoSQL repositories for diverse workloads
  • Leverage modern data platform principles including lakehouse, data fabric and data mesh architectures to improve scalability, decentralization, governance, and interoperability
  • Design Scalable Architectures: Build flexible architectures to support analytics, real-time reporting, AI/ML Ops pipelines, feature stores, and graph-based data relationships
  • Unified Data Platform: Design high-performing and cost-effective solutions that handle both structured and unstructured data at high volume and high velocity
  • Establish Modeling Standards: Define enterprise-wide standards for data modeling, semantic layers, and golden datasets to enable consistency and reuse
  • Advance Governance & Quality: Implement frameworks for data lineage, quality, privacy, and security (GDPR, CCPA, PII protections)
  • Enable Personalization & Insights: Architect data capabilities that fuel advanced recommendations, relationship mapping, and connected user experiences
  • Evaluate & Integrate Technologies: Guide adoption of best-fit technologies to support evolving use cases in data science, AI, and scalable analytics
  • Partner Cross-Functionally: Collaborate with engineering, product, analytics, and data science teams to translate business vision into technical solutions
  • Mentor & Lead: Provide architectural leadership, coaching engineers on best practices while influencing strategic platform decisions

Requirements:

  • 10+ years of experience in data architecture, platform engineering, or data engineering, with at least 3 years in a senior or architect role
  • Proven expertise with modern cloud-native data stacks including Snowflake, Databricks, MDM, Data Catalogs and CDPs
  • Hands-on experience with Looker or other BI/semantic modeling tools
  • Strong understanding of graph-based data models, recommendation systems, and personalization architectures
  • Expertise in data modeling, metadata management, and semantic layers
  • Proficiency with streaming/event-driven data pipelines and real-time processing (Kafka, Pub/Sub, Spark Streaming, etc.); an illustrative sketch follows this list
  • Familiarity with AI/ML pipelines, feature stores, and unstructured/structured data integration
  • Strong foundation in data governance, lineage, cataloging, and compliance frameworks
  • Excellent communication and leadership skills to align technical direction across diverse teams
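
As a hedged illustration of the streaming and real-time processing proficiency listed above, here is a minimal PySpark Structured Streaming sketch that consumes events from a Kafka topic and maintains a running per-campaign aggregate. The broker address, topic name, and event schema are hypothetical placeholders, not details taken from the posting.

```python
# Minimal sketch (hypothetical topic, broker, and schema): consume events
# from Kafka and keep a running per-campaign donation total with
# Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("donation-stream-sketch").getOrCreate()

# Hypothetical event payload; adjust to the real message format.
schema = StructType([
    StructField("campaign_id", StringType()),
    StructField("amount_usd", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
    .option("subscribe", "donation-events")                # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; decode the value and apply the schema.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

# Running total per campaign; a real pipeline would write to a lakehouse table.
totals = events.groupBy("campaign_id").sum("amount_usd")

query = totals.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```

Running this also assumes the Spark-Kafka connector package is available on the cluster; the console sink stands in for whatever curated table or feature store the platform actually targets.
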
What we offer:
  • Make an Impact: Be part of a mission-driven organization making a positive difference in millions of lives every year
  • Innovative Environment: Work with a diverse, passionate, and talented team in a fast-paced, forward-thinking atmosphere
  • Collaborative Team: Join a fun and collaborative team that works hard and celebrates success together
  • Competitive Benefits: Enjoy competitive pay and comprehensive healthcare benefits
  • Holistic Support: Enjoy financial assistance for things like hybrid work, family planning, along with generous parental leave, flexible time-off policies, and mental health and wellness resources to support your overall well-being
  • Growth Opportunities: Participate in learning, development, and recognition programs to help you thrive and grow
  • Commitment to DEI: Contribute to diversity, equity, and inclusion through ongoing initiatives and employee resource groups
  • Community Engagement: Make a difference through our volunteering program
  • Equity

Additional Information:

Job Posted:
December 08, 2025

Employment Type:
Fulltime
Work Type:
Hybrid work

Similar Jobs for Data Platform Architect

IT Data Platform Architect

As an IT Data Platform Architect, you will be instrumental in designing and impl...
Location:
United States, Charlotte
Salary:
Not provided
Brightspeed
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field
  • 8+ years of relevant experience in data platform architecture
  • Experience with containerization technologies
  • Demonstrated ability in building and managing data platforms with integrated AI/ML capabilities
  • Strong knowledge and experience in GCP, with familiarity in other cloud platforms
  • Extensive experience in data architecture, especially in supporting AI/ML applications
  • Expertise in infrastructure as code, CI/CD practices, and troubleshooting across various technical domains
  • Strong ability in handling streaming data applications
  • Ability to communicate technical concepts effectively to diverse audiences
  • Proactive problem-solver with a keen eye for improving system architectures
Job Responsibility:
  • Develop a high-performance data architecture to support large-scale data processing and AI/ML analytics
  • Lead effort to implement infrastructure as code using tools like Terraform and Ansible to automate the deployment of both infrastructure and applications
  • Design CI/CD integrations using GitHub, GitHub Actions, Jenkins, ensuring smooth deployment on GCP
  • Create, update, and maintain comprehensive documentation, including procedural/process guides and infrastructure topology diagrams
  • Stay updated with technological advancements, advocating for and implementing necessary changes and updates to our systems
  • Proactively identify improvement areas in infrastructure architecture and develop plans for enhancements
  • Work with streaming data applications, ensuring robust data flow and integration (see the sketch after this list)
  • Possess knowledge of containerization technologies and orchestration tools to manage and scale applications effectively
  • Build reusable code, components, and services, focusing on versioning, reconciliation, and robust exception handling
  • Communicate complex technical concepts effectively to both technical and non-technical stakeholders
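
To make the streaming-data responsibility above concrete, here is a minimal Google Cloud Pub/Sub streaming-pull sketch in Python. The project and subscription names are hypothetical placeholders, and error handling is reduced to the essentials.

```python
# Minimal sketch (hypothetical project and subscription): stream messages
# from a Pub/Sub subscription and acknowledge them after processing.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

project_id = "example-project"            # placeholder
subscription_id = "events-subscription"   # placeholder

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # A real pipeline would validate, transform, and route the payload here.
    print(f"Received message: {message.data!r}")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

with subscriber:
    try:
        # Bounded wait for the sketch; a service would block indefinitely.
        streaming_pull_future.result(timeout=60)
    except TimeoutError:
        streaming_pull_future.cancel()
        streaming_pull_future.result()
```
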
What we offer:
  • competitive medical, dental, vision, and life insurance
  • employee assistance program
  • 401K plan with company match
  • host of voluntary benefits
Employment Type: Fulltime

Cloud Data Platform Architect

Circle K is transforming our Data Engineering and BI platform to match our busin...
Location:
United States of America, Charlotte
Salary:
Not provided
Circle K
Expiration Date
Until further notice
Requirements:
  • 7+ years of professional experience in designing & architecting Data & Analytics solutions, with a focus on Azure-based platforms and enterprise-scale systems
  • Working experience in designing and architecting solutions that leverage Databricks and Snowflake
  • Hands-on experience with Azure Cloud services (Azure Synapse/SQL Server, ADF or close equivalents)
  • Working experience with Microsoft Power BI (Power BI Platform, AAS/Tabular) integration with Azure-based Platforms
  • Expert understanding of relational databases, Data Warehouse & Data Lake modeling techniques & concepts, ETL/ELT processing patterns, and Big Data technologies
  • Practical experience in designing systems to handle large data volumes
  • Practical experience in designing systems for large-scale data processing with a focus on Azure performance optimization and cost management
  • Working knowledge of Python, PySpark, SQL & T-SQL
  • Working experience in designing and architecting solutions that comply with data security industry standards and regulations, including RBAC, data encryption (GDPR, PCI, etc.), and monitoring
  • Microsoft Azure Certification required
Job Responsibility:
  • Designing, building, and maintaining robust data platforms and solutions on Azure
  • Optimizing data delivery and ensuring the architecture aligns with business objectives
  • Leading architectural decisions and establishing governance standards
  • Collaborating across teams to ensure seamless data flows and scalable solutions
  • Driving the adoption and usage of Azure Databricks, Snowflake, Microsoft Fabric, and Power BI in the data platform (see the sketch below)
  • Performing architectural assessments and defining solutions to produce detailed design documents
  • Providing technical direction on Azure platform services
  • Mentoring Data Engineering and Data Science teams
  • Providing technical support for platform performance tuning and optimization activities
  • Participating in the creation and maintenance of technical roadmaps
Employment Type: Fulltime
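
As a small illustration of the Databricks-oriented responsibilities in this card, the following PySpark sketch lands a cleaned dataset as a Delta table. The source path, column names, and table name are hypothetical; on an Azure Databricks cluster the SparkSession and Delta Lake support are already provided.

```python
# Minimal sketch (hypothetical paths, columns, and table name): clean a raw
# CSV extract and save it as a curated Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("delta-load-sketch").getOrCreate()

# Hypothetical raw extract already staged in cloud storage.
raw = spark.read.option("header", True).csv("/mnt/raw/sales/*.csv")

cleaned = (
    raw.withColumn("sale_date", to_date(col("sale_date")))
       .withColumn("amount", col("amount").cast("double"))
       .dropDuplicates(["transaction_id"])
)

# Overwrite the curated table; incremental loads would typically use MERGE instead.
cleaned.write.format("delta").mode("overwrite").saveAsTable("curated.sales_transactions")
```
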

Data Intelligence Data Architect

We are seeking a visionary Data Architect to lead the design, governance, and op...
Location:
Serbia, Belgrade
Salary:
Not provided
Everseen
Expiration Date
Until further notice
Requirements:
  • 8+ years in data architecture, data engineering, or enterprise data management
  • Strong experience in data integration architecture across complex systems
  • Expertise in data modeling (conceptual, logical, physical) and database technologies
  • Strong knowledge of cloud data platforms (AWS, Azure, GCP) and integration tools
  • Familiarity with data governance frameworks and regulatory compliance
  • Proficiency in SQL and Python for building data pipelines, performing data transformations, and implementing automation tasks
Job Responsibility:
  • Develop and maintain the enterprise data architecture blueprint aligned with business strategy and AI product roadmaps
  • Define enterprise-wide data models, taxonomies, and standards for consistent data usage
  • Collaborate with business stakeholders to identify data-driven revenue opportunities including APIs, data products, and new service offerings
  • Design and oversee data integration solutions (ETL/ELT, APIs, streaming, event-driven architecture) across applications, platforms, and business units
  • Enable real-time and batch data flows to support operational and analytical systems
  • Ensure data accessibility across business units and external partners while adhering to data sovereignty and compliance laws
  • Implement data governance policies covering metadata management, data lineage, access control, and retention
  • Define data quality metrics and oversee data cleansing and validation initiatives
  • Define data stewardship roles and accountability structures
  • Select, implement, and manage enterprise data platforms (data lakes, API gateways, event streaming platforms)
Employment Type: Fulltime

Data Cloud Platform Architect

Ivy Partners is a Swiss consulting firm that supports businesses in their strate...
Location:
Portugal, Porto
Salary:
Not provided
IVY Partners
Expiration Date
Until further notice
Requirements:
  • Deep expertise in AWS data services like Lambda, Glue, Step Functions, and Redshift
  • Exposure to Azure data services
  • Extensive experience with Infrastructure as Code (IaC) using Terraform and CloudFormation
  • Ability to define and enforce data governance and security standards
  • Experience leading large-scale data migration and optimization projects
  • Strong programming skills in Python and SQL, with a history of prototyping and setting coding standards
  • Experience with Iceberg tables and managing large datasets efficiently
  • Proficiency in designing scalable and efficient data solutions on AWS, following best practices for cloud architecture and infrastructure
  • Experience with orchestration tools such as Apache Airflow and AWS Step Functions
  • Knowledge of ETL tools and experience working with large volumes of data, preferably with Kafka
Job Responsibility:
  • Utilize deep expertise in AWS data services while also engaging with Azure data services
  • Lead in the implementation and enforcement of data governance and security standards across cloud platforms
  • Manage Infrastructure as Code (IaC) solutions using tools like Terraform and CloudFormation
  • Spearhead optimization projects to ensure minimal disruption and maximum efficiency
  • Apply strong coding skills in Python and SQL to prototype solutions and establish coding standards
  • Develop robust solution architectures focusing on scalability, performance, security, and cost optimization
  • Design efficient data models and optimize query performance for handling large datasets
  • Oversee ETL processes and manage data integration into systems like Redshift, DuckDB, and PostgreSQL
  • Set up and manage AWS logging and tracing mechanisms using CloudTrail and X-Ray
  • Implement orchestration solutions using Apache Airflow and AWS Step Functions (see the sketch after this list)
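
As a sketch of the orchestration responsibility noted above, here is a minimal Apache Airflow DAG in Python that chains an extract task into a load task. The DAG id, schedule, and task bodies are hypothetical placeholders, and the syntax assumes a recent Airflow 2.x release.

```python
# Minimal sketch (hypothetical DAG id, schedule, and task bodies): a daily
# two-step extract-and-load flow defined as an Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull data from a source system (API, S3 prefix, etc.).
    print("extracting...")

def load(**context):
    # Placeholder: write transformed data to Redshift, DuckDB, or PostgreSQL.
    print("loading...")

with DAG(
    dag_id="daily_extract_load_sketch",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```
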
What we offer:
  • Supportive environment where everyone is valued
  • Training opportunities
  • Chance for growth in Switzerland and internationally
  • Trust-based work culture based on transparency, professionalism, and commitment
  • Encouragement of innovation
  • Collective action striving to make a positive impact
Employment Type: Fulltime

Data Architect

Delivery Centric is seeking a highly skilled Data Architect to design cloud-read...
Location:
Australia, Sydney
Salary:
Not provided
Delivery Centric Technologies
Expiration Date
Until further notice
Requirements:
  • Proven experience in data architecture, data modelling, and enterprise data platform design
  • Strong expertise in SQL, NoSQL, data warehousing, and major cloud platforms (Azure, AWS, GCP)
  • Hands-on experience with ETL/ELT tooling and big data technologies (Spark, Hadoop)
  • Experience building data pipelines and event-driven workflows
  • Certifications ideal for this role: Azure Data Engineer, AWS Developer, Databricks Data Engineer
  • Exposure to AI/ML environments and advanced analytical use cases
  • Strong analytical and problem-solving capabilities with excellent stakeholder engagement skills
Job Responsibility:
  • Design scalable, secure, and high-performing data architectures aligned to business objectives
  • Develop conceptual, logical, and physical data models for enterprise data platforms
  • Drive data governance practices, ensuring compliance, quality, and security across all data assets
  • Lead integration initiatives and build reliable data pipelines across cloud and on-prem ecosystems
  • Optimize existing data platforms, improving performance, scalability, and operational efficiency
  • Collaborate with business stakeholders to translate requirements into technical solutions
  • Maintain architecture documentation, standards, data dictionaries, and solution diagrams
  • Support big data, analytics, and AI/ML initiatives through scalable data foundations
Employment Type: Fulltime

Enterprise Data Architect

The Enterprise Data Architect (EDA) is responsible for defining and advancing th...
Location:
United States, San Mateo
Salary:
200,000.00 - 275,000.00 USD / Year
Verkada
Expiration Date
Until further notice
Requirements:
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or related field
  • 10+ years of experience in data architecture, data engineering, or enterprise systems integration
  • Proven track record in designing enterprise-level data architectures in complex, multi-system environments
  • Expertise in data modeling, MDM, metadata management, and data governance frameworks
  • Experience with cloud-based data platforms (AWS, Azure, GCP) and modern data stacks (e.g., Redshift, BigQuery, Snowflake)
  • Experience with integration platforms (ETL/ELT, APIs, event streaming, middleware)
  • Knowledge of data security (tokenization, encryption, access controls) and compliance frameworks
  • Strong ability to influence cross-functional stakeholders and drive consensus across departments
  • Excellent communication skills, capable of presenting complex data concepts to both technical and non-technical audiences
  • Strategic thinker with the ability to balance long-term architecture vision with short-term delivery needs
Job Responsibility:
  • Define and own the enterprise data strategy in alignment with corporate goals
  • Design and maintain comprehensive data architecture spanning ERP, HRIS, CRM, ATS, Finance, and other core business systems
  • Ensure the architecture supports scalability, performance, and resilience as the company grows
  • Lead initiatives to break down data silos and establish trusted, unified sources of truth across the organization
  • Champion a company-wide data governance framework
  • Facilitate Regular Meetings
  • Drive Cross-Functional Collaboration
  • Identify & Track Data Gaps
  • Coordinate Remediation Efforts
  • Implement Monitoring & Alerts
Employment Type: Fulltime

Data Architect

As a Data and Technical Architect for Salesforce Data Cloud at Horizontal Digita...
Location:
India, Jaipur
Salary:
Not provided
Horizontal Digital
Expiration Date
Until further notice
Requirements:
  • 7+ years of client-facing consulting/professional services experience delivering enterprise-grade data solutions
  • 3+ years of experience implementing Salesforce Data Cloud or equivalent Customer Data Platforms (e.g., Adobe AEP, Segment, Tealium, Arm Treasure Data, BlueShift)
  • Strong background in data architecture, data modeling, ETL/ELT pipelines, data integration, and API-driven solutions
  • Certifications in Salesforce Data Cloud and a solid understanding of the Salesforce ecosystem (Sales Cloud, Service Cloud, Marketing Cloud)
  • Experience implementing data governance, data security, and regulatory compliance (e.g., GDPR, CCPA) frameworks
  • Expertise in identity resolution, subscriber management, and harmonizing data across systems to enable a single customer view
  • Demonstrated success in facilitating technical workshops, delivering solution documentation, and leading cross-functional technical teams
  • Strong analytical and problem-solving skills with expertise in agile delivery methodologies and complex solution lifecycles
  • Excellent written and verbal communication skills for engaging with technical and non-technical stakeholders alike
  • Industry experience in one or more of the following: Financial Services, Health and Life Sciences, Manufacturing, Retail, or Hospitality
Job Responsibility:
  • Facilitate and lead technical discovery workshops to document detailed data architecture, integration requirements, and data ingestion strategies
  • Synthesize complex requirements to create clear and comprehensive technical solution designs and collaborate with technical teams to document and implement them
  • Assess current-state data ecosystems, contact/subscriber management, and identity resolution processes, while defining the future-state architecture and performing gap analysis across data, platform and technology
  • Lead the refinement, design, and configuration of complex data models, ensuring alignment with business processes, scalability, and Salesforce Data Cloud best practices
  • Collaborate with cross-functional data teams to design and implement data integration and migration strategies leveraging ETL tools, APIs, and middleware solutions
  • Guide and oversee the execution of user acceptance testing (UAT), ensuring the delivery of solutions that meet client expectations and quality standards
  • Serve as the primary technical point of contact for client stakeholders, providing enablement training on Salesforce Data Cloud and driving adoption across their teams
  • Advocate for data governance best practices, including data privacy, quality assurance, and regulatory compliance frameworks
  • Collaborate with Sales, Go-to-Market, and professional services teams in pre-sales activities, including scoping, solution estimation, and proposal development
  • Contribute to internal growth by developing thought leadership, building best practices, delivering training, and mentoring teams to scale Data Cloud expertise across the organization

Senior AWS Data Engineer / Data Platform Engineer

We are seeking a highly experienced Senior AWS Data Engineer to design, build, a...
Location:
United Arab Emirates, Dubai
Salary:
Not provided
NorthBay
Expiration Date
Until further notice
Requirements:
  • 8+ years of experience in data engineering and data platform development
  • Strong hands-on experience with: AWS Glue, Amazon EMR (Spark), AWS Lambda, Apache Airflow (MWAA), Amazon EC2, Amazon CloudWatch, Amazon Redshift, Amazon DynamoDB, AWS DataZone
Job Responsibility:
  • Design, develop, and optimize scalable data pipelines using AWS native services
  • Lead the implementation of batch and near-real-time data processing solutions
  • Architect and manage data ingestion, transformation, and storage layers
  • Build and maintain ETL/ELT workflows using AWS Glue and Apache Spark on EMR (see the sketch after this list)
  • Orchestrate complex data workflows using Apache Airflow (MWAA)
  • Develop and manage serverless data processing using AWS Lambda
  • Design and optimize data warehouses using Amazon Redshift
  • Implement and manage NoSQL data models using Amazon DynamoDB
  • Utilize AWS DataZone for data governance, cataloging, and access management
  • Monitor, log, and troubleshoot data pipelines using Amazon CloudWatch
Employment Type: Fulltime
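
To illustrate the Glue-based ETL work this card describes, here is a minimal AWS Glue (PySpark) job skeleton. The catalog database, table name, and output S3 prefix are hypothetical placeholders.

```python
# Minimal sketch (hypothetical catalog table and S3 prefix): a Glue PySpark
# job that reads from the Data Catalog, applies a simple cleanup, and writes
# Parquet output.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical source table registered in the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)

# Drop rows without a primary key; real jobs would apply richer quality rules.
cleaned = source.toDF().dropna(subset=["order_id"])

# Write the curated output to a hypothetical S3 prefix.
cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```
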