Senior Data Engineering Manager

Atlassian

Location: United States, San Francisco

Category: IT - Software Development

Contract Type: Employment contract

Salary: 168700.00 - 271100.00 USD / Year

Job Description:

Data is a big deal at Atlassian. We ingest billions of events each month into our analytics platform, and dozens of teams across the company depend on it to drive their decisions. The team is also responsible for building and managing Atlassian’s operational data store, which is critical to making decisions and guiding operations based on the data and services we provide. We are looking for a data engineering manager to join our growing Finance Data Engineering team, part of the Data Engineering department. Come join our efforts to further democratize data and build exceptional data products for the company.

Job Responsibility:

  • build and lead a team of data engineers through hiring, coaching, mentoring, and hands-on career development
  • provide deep technical guidance in a number of aspects of data engineering in a scalable ecosystem
  • champion cultural and process improvements through engineering excellence, quality and efficiency
  • work with close counterparts in other departments as part of a multi-functional team, and build this culture in your team

Requirements:

  • stellar people management skills and experience in leading an agile software team
  • thrive when developing phenomenal people, not just great products
  • worked closely with Data Science, analytics, and platform teams
  • expertise in building and maintaining high-quality components and services
  • able to drive technical excellence, pushing for innovation and quality
  • at least 10 years of experience in a software development role as an individual contributor
  • 4+ years of people management experience
  • deep understanding of data challenges at scale and the surrounding ecosystem
  • experience building and architecting solutions with public cloud offerings such as Amazon Web Services (DynamoDB, Elasticsearch, S3), Databricks, Spark/Spark Streaming, and graph databases
  • experience with enterprise data architecture standards and methodologies
  • experience with test automation and continuous delivery
  • a graduate degree in Computer Science or similar discipline

Nice to have:

  • experience with financial data in an enterprise setting
  • experience with at least one high-level programming language such as Python
  • understanding and experience building RESTful APIs and microservices
  • experience with Tableau and Tableau administration
  • experience with Machine Learning
  • have committed code to open-source projects and can share examples

What we offer:
  • health coverage
  • paid volunteer days
  • wellness resources

Additional Information:

Job Posted:
March 19, 2025

Employment Type:
Full-time

Work Type:
On-site work

Similar Jobs for Senior Data Engineering Manager

Senior Backend Engineer / Tech Lead (Data Management)

As a Senior Backend Software Engineer at Aignostics, you work hand in hand with ...
Location: Germany, Berlin
Salary: Not provided
Aignostics
Expiration Date: Until further notice

Requirements:
  • Bachelor and/or Master in a relevant field or extensive work experience
  • 6+ years of software development experience in a data intensive environment
  • Experience leading a technical initiative ideally with cross-team impact
  • Strong background in software development ideally with Python
  • Experience with cloud providers (GCP, AWS) and their services
  • Experience with container orchestration (preferably Kubernetes)
  • Experience with database systems
  • Familiar with CI/CD pipelines, code reviews, and other standards to maintain code quality
  • Driven self-starter, well-organized, excellent communication skills and a strong team player

Job Responsibility:
  • Design and develop services and core libraries that enable our SaaS platform
  • Ensure reliable, high throughput access to our data for machine learning
  • Maintain and expand our data management infrastructure
  • Lead initiatives, evaluate new technologies and their integration into our current codebase
  • Eagerness to take ownership - from inception to completion - without losing focus on the business context
  • Communicate closely with our frontend and machine learning teams
  • Perform code reviews, considering readability, design and performance

What we offer:
  • Learning & Development yearly budget of 1,000€ (plus 2 L&D days)
  • Language classes and internal development programs
  • Mentoring program
  • Flexible working hours and teleworking policy
  • 30 paid vacation days per year
  • Family & pet friendly, with flexible parental leave options
  • Subsidized membership of your choice among public transport, sports and well-being
  • Social gatherings, lunches, and off-site events
  • Optional company pension scheme

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location: United States, Lisle
Salary: 84835.61 - 149076.17 USD / Year
Adtalem Global Education
Expiration Date: Until further notice

Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field.
  • Two (2)+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow, BQML, Vertex AI.
  • Six (6)+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics.
  • Hands-on experience working with real-time, unstructured, and synthetic data.
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar.
  • Expert knowledge on Python programming and SQL.
  • Experience with cloud platforms (AWS, GCP, Azure) and their data services
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed.
  • Familiarity with synthetic data generation and unstructured data processing

Job Responsibility:
  • Architect, develop, and optimize scalable data pipelines handling real-time, unstructured, and synthetic datasets
  • Collaborate with cross-functional teams, including data scientists, analysts, and product owners, to deliver innovative data solutions that drive business growth.
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound.
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform.
  • Leverage streaming technologies and cloud platforms to enable real-time data processing and analytics
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products.
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root.
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata

What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays
  • Eligible to participate in an annual incentive program

Employment Type: Full-time

Senior Data Engineer

Adtalem is a data driven organization. The Data Engineering team builds data sol...
Location: United States, Lisle
Salary: 85000.00 - 150000.00 USD / Year
Adtalem Global Education
Expiration Date: Until further notice

Requirements:
  • Bachelor's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (Required)
  • Master's Degree in Computer Science, Computer Engineering, Software Engineering, or another related technical field (Preferred)
  • 2+ years of experience in Google Cloud with services like BigQuery, Composer, GCS, DataStream, Dataflow (Required)
  • 6+ years of experience in data engineering solutions such as data platforms, ingestion, data management, or publication/analytics (Required)
  • Expert knowledge on SQL and Python programming
  • Experience working with Airflow as a workflow management tool and building operators to connect, extract, and ingest data as needed
  • Experience in tuning queries for performance and scalability
  • Experience in Real Time Data ingestion using GCP PubSub, Kafka, Spark or similar
  • Excellent organizational, prioritization and analytical abilities
  • Proven experience with incremental execution, demonstrated through successful launches

Job Responsibility:
  • Work closely with various business, IT, Analyst and Data Science groups to collect business requirements
  • Design, develop, deploy and support high performance data pipelines both inbound and outbound
  • Model data platform by applying the business logic and building objects in the semantic layer of the data platform
  • Optimize data pipelines for performance, scalability, and reliability
  • Implement CI/CD pipelines to ensure continuous deployment and delivery of our data products
  • Ensure quality of critical data elements, prepare data quality remediation plans, and collaborate with business and system owners to fix quality issues at their root
  • Document the design and support strategy of the data pipelines
  • Capture, store and socialize data lineage and operational metadata
  • Troubleshoot and resolve data engineering issues as they arise
  • Develop REST APIs to expose data to other teams within the company

What we offer:
  • Health, dental, vision, life and disability insurance
  • 401k Retirement Program + 6% employer match
  • Participation in Adtalem’s Flexible Time Off (FTO) Policy
  • 12 Paid Holidays

Employment Type: Full-time

Senior Data Engineer

For this role, we are seeking a Senior Data Engineer for our Client's ETL Suppor...
Location: India
Salary: Not provided
3Pillar Global
Expiration Date: Until further notice

Requirements:
  • In-depth knowledge of AWS Glue, AWS Lambda, and AWS Step Functions
  • A deep understanding of ETL processes and data warehouse design
  • Proven ability to troubleshoot data pipelines and perform root cause analysis (RCA)
  • 3-5 years of relevant experience
  • Hands-on experience with Glue, Lambda, and Step Function development
  • Must be able to work a day shift that includes coverage for weekends and holidays on a rotational basis

Job Responsibility:
  • Monitor approximately 2,300 scheduled jobs (daily, weekly, monthly) to ensure timely and successful execution
  • Execute on-demand jobs as required by the business
  • Troubleshoot job failures, perform detailed root cause analysis (RCA), and provide clear documentation for all findings
  • Address and resolve bugs and data-related issues reported by the business team
  • Verify source file placement in designated directories to maintain data integrity
  • Reload Change Data Capture (CDC) tables when structural changes occur in source systems
  • Help manage synchronization between external databases (including Teradata write-backs) and AWS Glue tables
  • Assist in developing new solutions, enhancements, and bug fixes using AWS Glue, Lambda, and Step Functions
  • Answer questions from the business and support User Acceptance Testing (UAT) inquiries
  • Make timely decisions to resolve issues, execute tasks efficiently, and escalate complex problems to senior or lead engineers as needed, all while maintaining agreed-upon SLAs

Employment Type: Full-time

Senior Data Engineer

As a Senior Data Engineer, you will be pivotal in designing, building, and optim...
Location: United States
Salary: 102000.00 - 125000.00 USD / Year
Wpromote
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience
  • 4+ years of experience in data engineering or a related field
  • Intermediate to advanced programming skills in Python
  • Proficiency in SQL and experience with relational databases
  • Strong knowledge of database and data warehousing design and management
  • Strong experience with DBT (data build tool) and test-driven development practices
  • Proficiency with at least 1 cloud database (e.g. BigQuery, Snowflake, Redshift, etc.)
  • Excellent problem-solving skills, project management habits, and attention to detail
  • Advanced level Excel and Google Sheets experience
  • Familiarity with data orchestration tools (e.g. Airflow, Dagster, AWS Glue, Azure data factory, etc.)

Job Responsibility:
  • Developing data pipelines leveraging a variety of technologies including dbt and BigQuery
  • Gathering requirements from non-technical stakeholders and building effective solutions
  • Identifying areas of innovation that align with existing company and team objectives
  • Managing multiple pipelines across Wpromote’s client portfolio

What we offer:
  • Half-day Fridays year round
  • Unlimited PTO
  • Extended Holiday break (Winter)
  • Flexible schedules
  • Work from anywhere options*
  • 100% paid parental leave
  • 401(k) matching
  • Medical, Dental, Vision, Life, Pet Insurance
  • Sponsored life insurance
  • Short Term Disability insurance and additional voluntary insurance

Employment Type: Full-time

Senior Product Manager (data & analytics)

Userlane is a market-leading Digital Adoption Platform that empowers organisatio...
Location: Germany, Munich
Salary: Not provided
Userlane GmbH
Expiration Date: Until further notice

Requirements:
  • 5+ years of product management experience on data/analytics products within a modern product organisation
  • Experience building analytics for executives or senior leaders
  • Strong opinions on data visualisation for time-poor executives
  • Experience in enterprise B2B (not B2C product analytics)
  • Healthcare experience is a strong plus
  • Capable of explaining things to engineers, salespeople, support agents and CEOs alike
  • Fluent in English (spoken and written)

Job Responsibility:
  • Own Application Intelligence at Userlane
  • Shape what we build across App Discovery, HEART Analytics and Portfolio Overview
  • Deliver outcome-linked metrics
  • Build for time-poor executives
  • Scale portfolio intelligence
  • Connect fragmented data sources
  • Transform insights into action
  • Govern AI adoption

What we offer:
  • High-performance culture with great leadership and a fun, engaged, motivated and diverse team
  • Part of an empowered cross-functional team
  • Work directly with our VP Product to shape our 2026-2030 strategy
  • Userlane is among the global leaders in the rapidly growing Digital Adoption industry
  • Weekly 1:1s
  • Personalised skills assessment and development plan
  • On-the-job coaching
  • Budget for events and training

Employment Type: Full-time

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location: Vietnam, Hanoi
Salary: Not provided
Token Metrics
Expiration Date: Until further notice

Requirements:
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of Python, Java or any programming language development experience
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expert with building Data Lake, Data Warehouse or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work

Job Responsibility:
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building a data pipeline from different data sources using different data types like API, CSV, JSON, etc
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs

Employment Type: Full-time

Senior Data Engineer - Platform Enablement

SoundCloud empowers artists and fans to connect and share through music. Founded...
Location: United States, New York; Atlanta; East Coast
Salary: 160000.00 - 210000.00 USD / Year
SoundCloud
Expiration Date: Until further notice

Requirements:
  • 7+ years of experience in data engineering, analytics engineering, or similar roles
  • Expert-level SQL skills, including performance tuning, advanced joins, CTEs, window functions, and analytical query design
  • Proven experience with Apache Airflow (designing DAGs, scheduling, task dependencies, monitoring, Python)
  • Familiarity with event-driven architectures and messaging systems (Pub/Sub, Kafka, etc.)
  • Knowledge of data governance, schema management, and versioning best practices
  • Understanding of observability practices: logging, metrics, tracing, and incident response
  • Experience deploying and managing services in cloud environments, preferably GCP, AWS
  • Excellent communication skills and a collaborative mindset

Job Responsibility:
  • Develop and optimize SQL data models and queries for analytics, reporting, and operational use cases
  • Design and maintain ETL/ELT workflows using Apache Airflow, ensuring reliability, scalability, and data integrity
  • Collaborate with analysts and business teams to translate data needs into efficient, automated data pipelines and datasets
  • Own and enhance data quality and validation processes, ensuring accuracy and completeness of business-critical metrics
  • Build and maintain reporting layers, supporting dashboards and analytics tools (e.g. Looker, or similar)
  • Troubleshoot and tune SQL performance, optimizing queries and data structures for speed and scalability
  • Contribute to data architecture decisions, including schema design, partitioning strategies, and workflow scheduling
  • Mentor junior engineers, advocate for best practices and promote a positive team culture

What we offer:
  • Comprehensive health benefits including medical, dental, and vision plans, as well as mental health resources
  • Robust 401k program
  • Employee Equity Plan
  • Generous professional development allowance
  • Creativity and Wellness benefit
  • Flexible vacation and public holiday policy where you can take up to 35 days of PTO annually
  • 16 paid weeks for all parents (birthing and non-birthing), regardless of gender, to welcome newborns, adopted and foster children
  • Various snacks, goodies, and 2 free lunches weekly when at the office

Employment Type: Full-time