Senior Data Scientist, Platform (Integrity)

Airbnb

Location:
United States

Contract Type:
Not provided

Salary:
177000.00 - 208000.00 USD / Year

Job Description:

This role directly improves the safety, trust, and quality of real-world user experiences by advancing Airbnb’s ability to understand, interpret and act on content at scale. You will help shape how the platform reasons about listings, profiles, messages and other user-generated content by building the next generation of Trust Content Understanding Models.

Job Responsibilities:

  • Advance Airbnb’s content integrity capabilities by building Natural Language Processing (NLP) and LLM-based models that understand intent, policy compliance, quality and risk across listings, profiles, and user communications
  • Develop high-performing models for detecting problematic or misleading content, including text classification, semantic similarity, information extraction and generative model-based reasoning for policy interpretation and enforcement
  • Design and optimize human-in-the-loop Machine Learning (ML) systems for content review, labeling, escalation and continuous model improvement
  • Build systems to detect emerging content risks and abuse patterns across regions, cohorts and surfaces using statistical, ML and representation-learning approaches
  • Design intelligent sampling and evaluation strategies to measure rare events, policy recall, false positives/negatives and model blind spots in large-scale content systems (a minimal illustrative sketch follows this list)
  • Build and deploy production AI/ML systems for content integrity and trust content understanding, including feature engineering, model development and evaluation, thresholding, error analysis and end-to-end model lifecycle management
  • Partner with inference data scientists to conduct rigorous quantitative analyses, applying working knowledge of causal inference to interpret results, assess impact, and identify gaps and opportunities to improve content quality and trust outcomes
  • Develop frameworks to analyze tradeoffs between enforcement accuracy, user experience, operational cost and coverage, and propose strategies to optimize overall system effectiveness
  • Deliver robust research reports with effective data visualizations, clear storytelling and bullet-proof accuracy to drive forward impact in collaboration with cross-functional partners in product, engineering and operations
  • Think strategically about how to scale and evolve Airbnb’s content integrity defenses, helping define the long-term vision for the role of AI-driven content understanding across the Trust ecosystem
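
A minimal sketch, assuming a hypothetical labeled review sample with per-item risk scores, of how the policy recall, false-positive, and thresholding measurements mentioned above might be computed; everything in it is illustrative rather than Airbnb's actual tooling:

    # Illustrative only: evaluate a hypothetical content-risk classifier at
    # several decision thresholds. Labels: 1 = violating content, 0 = benign.
    def evaluate_at_threshold(labels, scores, threshold):
        """Count outcomes when flagging every item whose score >= threshold."""
        tp = fp = fn = tn = 0
        for label, score in zip(labels, scores):
            flagged = score >= threshold
            if flagged and label:
                tp += 1          # correctly flagged violation
            elif flagged and not label:
                fp += 1          # benign content flagged (false positive)
            elif not flagged and label:
                fn += 1          # missed violation (false negative)
            else:
                tn += 1
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0    # "policy recall"
        fpr = fp / (fp + tn) if (fp + tn) else 0.0       # false-positive rate
        return precision, recall, fpr

    # Hypothetical labeled sample and model scores for the example.
    labels = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
    scores = [0.92, 0.15, 0.40, 0.81, 0.05, 0.55, 0.30, 0.67, 0.22, 0.10]

    for threshold in (0.5, 0.7, 0.9):
        p, r, fpr = evaluate_at_threshold(labels, scores, threshold)
        print(f"threshold={threshold:.1f}  precision={p:.2f}  recall={r:.2f}  fpr={fpr:.2f}")

In practice, violations are rare enough that a simple random sample would be too sparse, which is why the role calls for intelligent (for example, stratified or importance-weighted) sampling before measurements of this kind.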

Requirements:

  • 5+ years of industry experience in a quantitative analysis role with a Master’s degree in a quantitative field (computer science, statistics, etc.), or 2+ years of experience with a Ph.D.
  • State-of-the-art knowledge of AI/ML models
  • Hands-on experience building, evaluating, and deploying NLP and LLM-based solutions, including text classification, information extraction, semantic understanding or generative applications
  • Working knowledge of causal inference
  • Skilled in statistical programming (Python or R) and database usage (SQL)
  • Proven ability to communicate clearly and effectively to audiences of varying technical levels
  • Ability to translate complex findings and results into compelling narratives that drive impact
  • Excellent project management, communication, and collaboration skills

Nice to have:

Trust & Safety experience is a plus

What we offer:
  • bonus
  • equity
  • benefits
  • Employee Travel Credits

Additional Information:

Job Posted:
February 14, 2026

Employment Type:
Full-time
Work Type:
Remote work

Similar Jobs for Senior Data Scientist, Platform (Integrity)

Senior Data Scientist – Audience Data Product

We are looking for a Senior Data Scientist – Audience Data Product to join our G...
Location: United States, Bethesda, MD
Salary: 52.06 - 82.45 USD / Hour
Marriott Bonvoy
Expiration Date: Until further notice
Requirements:
  • 4-year degree from an accredited university in Data Analytics, Computer Science, Engineering, Information Systems, or a similar quantitative discipline, with 4+ years of experience demonstrating progressive career growth and a history of exceptional performance in data science or analytics OR 2+ years of experience in data science or analytics with a Master's degree
  • Proven experience in building and optimizing audience segmentation and lookalike models for targeted advertising using data science techniques, including clustering, classification, and predictive modeling
  • Strong experience with writing AI/ML models in Python, Spark or SQL, and familiarity with cloud data warehouses (e.g., Snowflake, AWS) for data processing and model deployment
  • Demonstrated ability to use data science methodologies (e.g., machine learning, statistical modeling) to analyze and optimize large-scale datasets for audience targeting
  • Ability to collaborate cross-functionally with marketing, digital, and technology teams to align on broader data strategies and contribute insights to media optimization and performance measurement
Job Responsibilities:
  • Design and develop audience segmentation and various lookalike models to optimize targeting across owned & paid channels
  • Work with cross-functional teams to collect, analyze, and optimize media exposure data for continuous media optimization and audience measurement
  • Collaborate closely with teams from marketing, digital, and technology to evolve the media data ecosystem, leveraging advanced tools such as cloud data warehouses (Snowflake), data cleanrooms, and customer data platforms (CDPs)
  • Develop and implement segmentation and modeling strategies
  • Collaborate with marketing and analytics teams to design and build segmentation models and lookalike models for optimized targeting across paid media channels
  • Work with the AdTech and IT teams to ensure effective integration and deployment of data infrastructure that supports segmentation and audience activation efforts
  • Align segmentation models and tactics with the broader organizational data strategy to ensure consistency with long-term business goals and objectives
  • Collaborate closely with marketing and analytics teams to define key performance indicators (KPIs) and data requirements for optimizing segmentation strategies, improving campaign targeting, and driving ROI
  • Collaborate with technical teams and vendors (e.g., AWS, Snowflake) to build and enhance infrastructure that supports data-driven segmentation, model development, and performance measurement
  • Identify and work on improving processes that enhance data quality and reduce manual interventions in segmentation model development and activation workflows
What we offer:
  • Coverage for medical, dental, vision, health care flexible spending account, dependent care flexible spending account, life insurance, disability insurance, accident insurance, adoption expense reimbursements, paid parental leave, 401(k) plan, stock purchase plan, discounts at Marriott properties, commuter benefits, employee assistance plan, and childcare discounts
  • Full-time

Senior Data Scientist

Location: India, Bengaluru
Salary: Not provided
ADM
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree in Data Science, Machine Learning, Computer Science, Statistics, or a related field. Master’s degree or Ph.D. is a plus
  • 7+ years of experience in data science, machine learning, or AI, with demonstrated success in building models that drive business outcomes
  • Proficient in Python, R, and SQL for data analysis, modeling, and data pipeline development
  • Experience with DevSecOps practices and tools such as GitHub, Azure DevOps, Terraform, Bicep, AquaSec, etc.
  • Experience with cloud platforms (Azure, AWS, Google Cloud) and large-scale data processing tools (e.g., Hadoop, Spark)
  • Strong understanding of both supervised and unsupervised learning models and techniques
  • Experience with frameworks like TensorFlow, PyTorch, and working knowledge of Generative AI models like GPT and GANs
  • Hands-on experience with Generative AI techniques, but with a balanced approach to leveraging them where they can add value
  • Proven experience in rapid prototyping and ability to iterate quickly to meet business needs in a dynamic environment
Job Responsibilities:
  • Lead end-to-end machine learning projects, from data exploration and modeling through deployment, ensuring alignment with business objectives
  • Utilize traditional AI/data science methods (e.g., regression, classification, clustering) and advanced AI methods (e.g., neural networks, NLP) to address business problems and optimize processes
  • Implement and experiment with Generative AI models based on business needs using Prompt Engineering, Retrieval Augmented Generation (RAG) or fine-tuning, with LLMs, LVMs, TTS, etc.
  • Collaborate with teams across Digital & Innovation, business stakeholders, software engineers, and product teams, to rapidly prototype and iterate on new models and solutions
  • Mentor and coach junior data scientists and analysts, fostering an environment of continuous learning and collaboration
  • Adapt quickly to new AI advancements and technologies, continuously learning and applying emerging methodologies to solve complex problems
  • Work closely with other teams (e.g., Cybersecurity, Cloud Engineering) to ensure the successful integration of models into production systems
  • Ensure models meet rigorous performance, accuracy, and efficiency standards, performing cross-validation, tuning, and statistical checks
  • Communicate results and insights effectively to both technical and non-technical stakeholders, delivering clear recommendations for business impact
  • Ensure adherence to data privacy, security policies, and governance standards across all data science initiatives

Senior Data Scientist, Marketplace

You will join a team dedicated to driving innovation and optimizing our dynamic ...
Location: United States
Salary: 177000.00 - 208000.00 USD / Year
Airbnb
Expiration Date: Until further notice
Requirements:
  • 5+ years of relevant industry experience, or 2+ years with a PhD
  • Background in economics, statistics, and experimentation
  • A PhD or Master’s in economics or a related field is valued
  • Strong fluency in Python or R for hands-on IC work and SQL for advanced data analysis at scale
  • Experience with causal inference and machine learning techniques, ideally in a multi-sided platform setting
  • Proven ability to succeed in collaborative environments with cross-functional stakeholders and also in independent work environments
  • Proven ability to communicate clearly and effectively to audiences of varying technical levels
Job Responsibilities:
  • Specifying and estimating models to determine the impact of supply and demand on marketplace outcomes
  • Delivering data and insights to enable the company’s supply growth efforts
  • Crafting models to accurately forecast key marketplace metrics, enabling proactive decision-making and strategic planning
  • Advanced causal inference: Utilizing cutting-edge causal inference techniques, you will quantify the impact of changes in supply and demand on the marketplace, informing targeted improvements and interventions
  • Cross-functional collaboration: You will actively collaborate with teams across product, engineering, and operations to integrate data science insights into initiatives aimed at enhancing supply acquisition, retention, and success
  • Stakeholder engagement: You will engage with stakeholders to understand their business objectives and provide data-driven solutions tailored to these goals
What we offer:
  • bonus
  • equity
  • benefits
  • Employee Travel Credits
  • Full-time

Senior Data Engineering Architect

Location: Poland
Salary: Not provided
Lingaro
Expiration Date: Until further notice
Requirements:
  • Proven work experience as a Data Engineering Architect or a similar role and strong experience in the Data & Analytics area
  • Strong understanding of data engineering concepts, including data modeling, ETL processes, data pipelines, and data governance
  • Expertise in designing and implementing scalable and efficient data processing frameworks
  • In-depth knowledge of various data technologies and tools, such as relational databases, NoSQL databases, data lakes, data warehouses, and big data frameworks (e.g., Hadoop, Spark)
  • Experience in selecting and integrating appropriate technologies to meet business requirements and long-term data strategy
  • Ability to work closely with stakeholders to understand business needs and translate them into data engineering solutions
  • Strong analytical and problem-solving skills, with the ability to identify and address complex data engineering challenges
  • Proficiency in Python, PySpark, SQL
  • Familiarity with cloud platforms and services, such as AWS, GCP, or Azure, and experience in designing and implementing data solutions in a cloud environment
  • Knowledge of data governance principles and best practices, including data privacy and security regulations
Job Responsibilities:
  • Collaborate with stakeholders to understand business requirements and translate them into data engineering solutions
  • Design and oversee the overall data architecture and infrastructure, ensuring scalability, performance, security, maintainability, and adherence to industry best practices
  • Define data models and data schemas to meet business needs, considering factors such as data volume, velocity, variety, and veracity
  • Select and integrate appropriate data technologies and tools, such as databases, data lakes, data warehouses, and big data frameworks, to support data processing and analysis
  • Create scalable and efficient data processing frameworks, including ETL (Extract, Transform, Load) processes, data pipelines, and data integration solutions
  • Ensure that data engineering solutions align with the organization's long-term data strategy and goals
  • Evaluate and recommend data governance strategies and practices, including data privacy, security, and compliance measures
  • Collaborate with data scientists, analysts, and other stakeholders to define data requirements and enable effective data analysis and reporting
  • Provide technical guidance and expertise to data engineering teams, promoting best practices and ensuring high-quality deliverables. Support the team throughout the implementation process, answering questions and addressing issues as they arise
  • Oversee the implementation of the solution, ensuring that it is implemented according to the design documents and technical specifications
What we offer:
  • Stable employment. On the market since 2008, 1500+ talents currently on board in 7 global sites
  • Workation. Enjoy working from inspiring locations in line with our workation policy
  • Great Place to Work® certified employer
  • Flexibility regarding working hours and your preferred form of contract
  • Comprehensive online onboarding program with a “Buddy” from day 1
  • Cooperation with top-tier engineers and experts
  • Unlimited access to the Udemy learning platform from day 1
  • Certificate training programs. Lingarians earn 500+ technology certificates yearly
  • Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly
  • Grow as we grow as a company. 76% of our managers are internal promotions

Senior Data Engineer

As a Senior Software Engineer, you will play a key role in designing and buildin...
Location: United States
Salary: 156000.00 - 195000.00 USD / Year
Apollo.io
Expiration Date: Until further notice
Requirements:
  • 5+ years of experience in platform engineering, data engineering, or a data-facing role
  • Experience in building data applications
  • Deep knowledge of the data ecosystem with an ability to collaborate cross-functionally
  • Bachelor's degree in a quantitative field (Physical / Computer Science, Engineering or Mathematics / Statistics)
  • Excellent communication skills
  • Self-motivated and self-directed
  • Inquisitive, able to ask questions and dig deeper
  • Organized, diligent, and great attention to detail
  • Acts with the utmost integrity
  • Genuinely curious and open
Job Responsibilities:
  • Architect and build robust, scalable data pipelines (batch and streaming) to support a variety of internal and external use cases
  • Develop and maintain high-performance APIs using FastAPI to expose data services and automate data workflows
  • Design and manage cloud-based data infrastructure, optimizing for cost, performance, and reliability
  • Collaborate closely with software engineers, data scientists, analysts, and product teams to translate requirements into engineering solutions
  • Monitor and ensure the health, quality, and reliability of data flows and platform services
  • Implement observability and alerting for data services and APIs (think logs, metrics, dashboards)
  • Continuously evaluate and integrate new tools and technologies to improve platform capabilities
  • Contribute to architectural discussions, code reviews, and cross-functional projects
  • Document your work, champion best practices, and help level up the team through knowledge sharing
What we offer:
  • Equity
  • Company bonus or sales commissions/bonuses
  • 401(k) plan
  • At least 10 paid holidays per year
  • Flex PTO
  • Parental leave
  • Employee assistance program and wellbeing benefits
  • Global travel coverage
  • Life/AD&D/STD/LTD insurance
  • FSA/HSA and medical, dental, and vision benefits
  • Full-time

Senior Data Engineer

At Relatient, we’re on a mission to simplify access to care – intelligently. As ...
Location: India, Pune
Salary: Not provided
Relatient
Expiration Date: Until further notice
Requirements:
  • Bachelor's degree (B.E./B.Tech) in computer engineering, or equivalent work experience in lieu of a degree, is required; Master’s degree preferred
  • 7+ years of experience in database engineering, data warehousing, or data architecture
  • Proven expertise with at least one major data warehouse platform (e.g. Postgres, Snowflake, Redshift, BigQuery)
  • Strong SQL and ETL/ELT development skills
  • Deep understanding of data modeling
  • Experience with cloud data ecosystems (AWS)
  • Hands-on experience with orchestration tools and version control (Git)
  • Experience in data governance, security, and compliance best practices
  • Experience building/generating analytical reports using Power BI
Job Responsibilities:
  • Architect, design, and implement robust end-to-end data warehouse (DW) solutions using modern technologies (e.g. Postgres or on-prem solutions)
  • Define data modeling standards (dimensional and normalized) and build ETL/ELT pipelines for efficient data flow and transformation
  • Integrate data from multiple sources (ERP, CRM, APIs, flat files, real-time streams)
  • Develop and maintain scalable and reliable data ingestion, transformation, and storage pipelines
  • Ensure data quality, consistency, and lineage across all data systems
  • Analyze and tune SQL queries, schemas, indexes, and ETL processes to maximize database and warehouse performance
  • Monitor data systems and optimize storage costs and query response times
  • Implement high availability, backup, disaster recovery, and data security strategies
  • Collaborate with DevOps and Infrastructure teams to ensure optimal deployment, scaling, and performance of DW environments
  • Work closely with Data Scientists, Analysts, and Business Teams to translate business needs into technical data solutions
What we offer:
  • INR 5,00,000/- of life insurance coverage for all full-time employees and their immediate family
  • INR 15,00,000/- of group accident insurance
  • Education reimbursement
  • 10 national and state holidays, plus 1 floating holiday
  • Flexible working hours and a hybrid policy
  • Full-time

Senior Data Engineer

Location: United States, Flowood
Salary: Not provided
PhasorSoft Group
Expiration Date: Until further notice
Requirements:
  • Experience with Snowflake or Azure Cloud Data Engineering, including setting up and managing data pipelines
  • Proficiency in designing and implementing ETL processes for data integration
  • Knowledge of data warehousing concepts and best practices
  • Strong SQL skills for querying and manipulating data in Snowflake or Azure databases
  • Experience with data modeling techniques and tools to design efficient data structures
  • Understanding of data governance principles and experience implementing them in cloud environments
  • Proficiency in Tableau or Power BI for creating visualizations and interactive dashboards
  • Ability to write scripts (e.g., Python, PowerShell) for automation and orchestration of data pipelines
  • Skills to monitor and optimize data pipelines for performance and cost efficiency
  • Knowledge of cloud data security practices and tools to ensure data protection
Job Responsibilities:
  • Design, implement, and maintain data pipelines and architectures on Snowflake or Azure Cloud platforms
  • Develop ETL processes to extract, transform, and load data from various sources into data warehouses
  • Optimize data storage, retrieval, and processing for performance and cost-efficiency in cloud environments
  • Collaborate with stakeholders to understand data requirements and translate them into technical solutions
  • Implement data security and governance best practices to ensure data integrity and compliance
  • Work with reporting tools such as Tableau or Power BI to create interactive dashboards and visualizations
  • Monitor and troubleshoot data pipelines, ensuring reliability and scalability
  • Automate data workflows and processes using cloud-native services and scripting languages
  • Provide technical expertise and support to data analysts, scientists, and business users
  • Full-time

Senior Data Engineer

Our Senior Data Engineers enable public sector organisations to embrace a data-d...
Location: United Kingdom, Bristol; London; Manchester; Swansea
Salary: 60000.00 - 80000.00 GBP / Year
Made Tech
Expiration Date: Until further notice
Requirements:
  • Enthusiasm for learning and self-development
  • Proficiency in Git (incl. GitHub Actions) and the ability to explain the benefits of different branch strategies
  • Gathering and meeting the requirements of both clients and users on a data project
  • Strong experience in IaC and able to guide how one could deploy infrastructure into different environments
  • Owning the cloud infrastructure underpinning data systems through a DevOps approach
  • Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop
  • Good understanding of the possible architectures involved in modern data system design (e.g. Data Warehouse, Data Lakes and Data Meshes) and the different use cases for them
  • Ability to create data pipelines in a cloud environment and integrate error handling within these pipelines, with an understanding of how to create reusable libraries to encourage uniformity of approach across multiple data pipelines
  • Able to document and present an end-to-end diagram to explain a data processing system on a cloud environment, with some knowledge of how you would present diagrams (C4, UML etc.)
  • Ability to provide guidance on how to implement a robust DevOps approach in a data project, and to discuss the tools needed for DataOps in areas such as orchestration, data integration and data analytics
Job Responsibilities:
  • Enable public sector organisations to embrace a data-driven approach by providing data platforms and services that are high-quality, cost-efficient, and tailored to clients’ needs
  • Develop, operate, and maintain these services
  • Provide maximum value to data consumers, including analysts, scientists, and business stakeholders
  • Play one or more roles according to our clients' needs
  • Support as a senior contributor for a project, focusing on both delivering engineering work as well as upskilling members of the client team
  • Play more of a technical architect role and work with the larger Made Tech team to identify growth opportunities within the account
  • Have a drive to deliver outcomes for users
  • Make sure that the wider context of a delivery is considered and maintain alignment between the operational and analytical aspects of the engineering solution
What we offer:
  • 30 days of paid annual leave + bank holidays
  • Flexible Parental Leave
  • Part time remote working for all our staff
  • Paid counselling as well as financial and legal advice
  • Flexible benefit platform which includes a Smart Tech scheme, Cycle to work scheme, and an individual benefits allowance which you can invest in a Health care cash plan or Pension plan
  • Optional social and wellbeing calendar of events
  • Full-time