Senior Data Engineer

Atlassian

Location:
India, Bengaluru

Category:
IT - Software Development

Contract Type:
Not provided

Salary:
Not provided

Job Description:

Atlassian is looking for a Senior Data Engineer to join our Go-To Market Data Engineering (GTM-DE) team, which is responsible for building our data lake, maintaining our big data pipelines and services, and facilitating the movement of billions of messages each day. We work directly with business teams and numerous platform and engineering teams to support growth and retention strategies at Atlassian. We do this by providing reliable, trustworthy metrics and other data elements, as well as services and data products that help teams better self-serve and shorten their time to reliable insights. We are looking for an open-minded, structured thinker who is passionate about building services that scale.

Job Responsibility:

  • Help our partner teams ingest data faster into our data lake
  • Make our data products more efficient
  • Build self-serve data engineering capabilities within the company
  • Build microservices; architect, design, and promote self-serve capabilities at scale

Requirements:

  • 7+ years of professional experience as a software engineer or data engineer
  • A BS in Computer Science or equivalent experience
  • Strong programming skills (some combination of Python, Java, and Scala)
  • Experience writing SQL, structuring data, and data storage practices
  • Experience with data modeling
  • Knowledge of data warehousing concepts
  • Experience building data pipelines and microservices
  • Experience with Spark, Airflow, and other streaming technologies to process large volumes of streaming data
  • A willingness to accept failure, learn and try again
  • An open mind to try solutions that may seem impossible at first
  • Experience working on Amazon Web Services (in particular using EMR, Kinesis, RDS, S3, SQS and the like)

Nice to have:

  • Experience building self-service tooling and platforms
  • Experience designing and building Kappa-architecture platforms
  • A passion for building and running continuous integration pipelines
  • Experience building pipelines using Databricks and familiarity with its APIs
  • Contributions to open-source projects (e.g., Airflow operators)

What we offer:
  • Health coverage
  • Paid volunteer days
  • Wellness resources

Additional Information:

Job Posted:
March 19, 2025

Employment Type:
Full-time

Work Type:
On-site work

Similar Jobs for Senior Data Engineer

Senior Data Engineer

We are looking for a highly skilled Senior Data Engineer to lead the design and ...
Location:
United Kingdom
Salary:
45000.00 - 60000.00 GBP / Year
Activate Group Limited
Expiration Date:
Until further notice
Requirements:
  • Proven experience as a Senior Data Engineer, BI/Data Warehouse Engineer, or similar
  • Strong hands-on expertise with Microsoft Fabric and related services
  • End-to-end DWH development experience, from ingestion to modelling and consumption
  • Strong background in data modelling, including star schema, dimensional modelling and semantic modelling
  • Experience with orchestration, monitoring and optimisation of data pipelines
  • Proficiency in SQL and strong understanding of database principles
  • Ability to design scalable data architectures aligned to business needs
Job Responsibility:
  • Lead the design, architecture and build of a new enterprise data warehouse on Microsoft Fabric
  • Develop robust data pipelines, orchestration processes and monitoring frameworks using Fabric components (Data Factory, Data Engineering, Lakehouse)
  • Create scalable and high-quality data models to support analytics, Power BI reporting and self-service data consumption
  • Establish and enforce data governance, documentation and best practices across the data ecosystem
  • Collaborate with cross-functional teams to understand data needs and translate them into technical solutions
  • Provide technical leadership, mentoring and guidance to junior team members where required
What we offer:
  • 33 days holiday (including bank holidays)
  • Personal health cash plan – claim back the cost of things like dentist and optical check-ups
  • Enhanced maternity / paternity / adoption / shared parental pay
  • Life assurance: three times basic salary
  • Free breakfasts and fruit
  • Birthday surprise for everybody
  • Full-time

Senior Data Engineer

For this role, we are seeking a Senior Data Engineer for our Client's ETL Suppor...
Location:
India
Salary:
Not provided
3Pillar Global
Expiration Date:
Until further notice
Requirements:
  • In-depth knowledge of AWS Glue, AWS Lambda, and AWS Step Functions
  • A deep understanding of ETL processes and data warehouse design
  • Proven ability to troubleshoot data pipelines and perform root cause analysis (RCA)
  • 3-5 years of relevant experience
  • Hands-on experience with Glue, Lambda, and Step Function development
  • Must be able to work a day shift that includes coverage for weekends and holidays on a rotational basis
Job Responsibility:
  • Monitor approximately 2,300 scheduled jobs (daily, weekly, monthly) to ensure timely and successful execution
  • Execute on-demand jobs as required by the business
  • Troubleshoot job failures, perform detailed root cause analysis (RCA), and provide clear documentation for all findings
  • Address and resolve bugs and data-related issues reported by the business team
  • Verify source file placement in designated directories to maintain data integrity
  • Reload Change Data Capture (CDC) tables when structural changes occur in source systems
  • Help manage synchronization between external databases (including Teradata write-backs) and AWS Glue tables
  • Assist in developing new solutions, enhancements, and bug fixes using AWS Glue, Lambda, and Step Functions
  • Answer questions from the business and support User Acceptance Testing (UAT) inquiries
  • Make timely decisions to resolve issues, execute tasks efficiently, and escalate complex problems to senior or lead engineers as needed, all while maintaining agreed-upon SLAs
  • Full-time

Senior Data Engineer

Our client is a global jewelry manufacturer undergoing a major transformation, m...
Location:
Poland, Wroclaw
Salary:
Not provided
Zoolatech
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience as a Data Engineer with proven expertise in Azure Synapse Analytics and SQL Server
  • Advanced proficiency in SQL, covering relational databases, data warehousing, dimensional modeling, and cubes
  • Practical experience with Azure Data Factory, Databricks, and PySpark
  • Track record of designing, building, and delivering production-ready data products at enterprise scale
  • Strong analytical skills and ability to translate business requirements into technical solutions
  • Excellent communication skills in English, with the ability to adapt technical details for different audiences
  • Experience working in Agile/Scrum teams
Job Responsibility:
  • Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform
  • Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models
  • Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies
  • Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance
  • Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions
  • Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits
  • Promote a “build-once, consume-many” approach to maximize reuse and value creation across business verticals
  • Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering
What we offer:
  • Paid Vacation
  • Sick Days
  • Sport/Insurance Compensation
  • English Classes
  • Charity
  • Training Compensation

Senior Data Engineer

Our client is a global jewelry manufacturer undergoing a major transformation, m...
Location:
Turkey, Istanbul
Salary:
Not provided
Zoolatech
Expiration Date:
Until further notice
Requirements:
  • 5+ years of experience as a Data Engineer with proven expertise in Azure Synapse Analytics and SQL Server
  • Advanced proficiency in SQL, covering relational databases, data warehousing, dimensional modeling, and cubes
  • Practical experience with Azure Data Factory, Databricks, and PySpark
  • Track record of designing, building, and delivering production-ready data products at enterprise scale
  • Strong analytical skills and ability to translate business requirements into technical solutions
  • Excellent communication skills in English, with the ability to adapt technical details for different audiences
  • Experience working in Agile/Scrum teams
Job Responsibility:
  • Design, build, and maintain scalable, efficient, and reusable data pipelines and products on the Azure PaaS data platform
  • Collaborate with product owners, architects, and business stakeholders to translate requirements into technical designs and data models
  • Enable advanced analytics, reporting, and other data-driven use cases that support commercial initiatives and operational efficiencies
  • Ingest, transform, and optimize large, complex data sets while ensuring data quality, reliability, and performance
  • Apply DevOps practices, CI/CD pipelines, and coding best practices to ensure robust, production-ready solutions
  • Monitor and own the stability of delivered data products, ensuring continuous improvements and measurable business benefits
  • Promote a “build-once, consume-many” approach to maximize reuse and value creation across business verticals
  • Contribute to a culture of innovation by following best practices while exploring new ways to push the boundaries of data engineering
What we offer:
  • Paid Vacation
  • Hybrid Work (home/office)
  • Sick Days
  • Sport/Insurance Compensation
  • Holidays Day Off
  • English Classes
  • Training Compensation
  • Transportation compensation

Senior Data Engineer

As a Senior Data Engineer, you will be pivotal in designing, building, and optim...
Location:
United States
Salary:
102000.00 - 125000.00 USD / Year
Wpromote
Expiration Date:
Until further notice
Requirements:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent practical experience
  • 4+ years of experience in data engineering or a related field
  • Intermediate to advanced programming skills in Python
  • Proficiency in SQL and experience with relational databases
  • Strong knowledge of database and data warehousing design and management
  • Strong experience with DBT (data build tool) and test-driven development practices
  • Proficiency with at least 1 cloud database (e.g. BigQuery, Snowflake, Redshift, etc.)
  • Excellent problem-solving skills, project management habits, and attention to detail
  • Advanced level Excel and Google Sheets experience
  • Familiarity with data orchestration tools (e.g. Airflow, Dagster, AWS Glue, Azure data factory, etc.)
Job Responsibility:
  • Developing data pipelines leveraging a variety of technologies including dbt and BigQuery
  • Gathering requirements from non-technical stakeholders and building effective solutions
  • Identifying areas of innovation that align with existing company and team objectives
  • Managing multiple pipelines across Wpromote’s client portfolio
What we offer:
  • Half-day Fridays year round
  • Unlimited PTO
  • Extended Holiday break (Winter)
  • Flexible schedules
  • Work from anywhere options*
  • 100% paid parental leave
  • 401(k) matching
  • Medical, Dental, Vision, Life, Pet Insurance
  • Sponsored life insurance
  • Short Term Disability insurance and additional voluntary insurance
  • Full-time

Senior Data Engineer

As a senior member of our engineering team, you will take ownership of critical ...
Location:
Poland
Salary:
Not provided
Userlane GmbH
Expiration Date:
Until further notice
Requirements:
  • Minimum of 5 years of hands-on experience in designing and developing data processing systems
  • Experience being part of a team of software engineers and helping establish processes from scratch
  • Familiarity with DBMS like ClickHouse or a different SQL-based OLAP database
  • Experience with various data engineering tools like Airflow, Kafka, dbt
  • Experience building and maintaining applications with the following languages: Python, Golang, Typescript
  • Knowledge of container technologies like Docker and Kubernetes
  • Experience with CI/CD pipelines and automated testing
  • Ability to solve problems and balance structure with creativity
  • Ability to operate independently and apply strategic thinking with technical depth
  • Willingness to share information and skills with the team
Job Responsibility:
  • Shape and maintain our various data and backend components - DBs, APIs and services
  • Understand business requirements and analyze their impact on the design of our software services and tools
  • Identify architectural changes needed in our infrastructure to support a smooth process of adding new features
  • Research, propose, and deliver changes to our software architecture to address our engineering and product requirements
  • Design, develop, and maintain a solid and stable RESTful API based on industry standards and best practices
  • Collaborate with internal and external teams to deliver software that fits the overall ecosystem of our products
  • Stay up to date with the new trends and technologies that enable us to work smarter, not harder
What we offer:
  • Team & Culture: A high-performance culture with great leadership and a fun, engaged, motivated, and diverse team with people from over 20 countries
  • Market: Userlane is among the global leaders in the rapidly growing Digital Adoption industry
  • Growth: We take you and your development seriously. You can expect weekly 1:1s, a personalised skills assessment and development plan, on-the-job coaching, and a budget for events and training
  • Compensation: Significant financial upside with an attractive and incentivising package on a B2B basis
  • Full-time

Senior Crypto Data Engineer

Token Metrics is seeking a multi-talented Senior Big Data Engineer to facilitate...
Location:
Vietnam, Hanoi
Salary:
Not provided
Token Metrics
Expiration Date:
Until further notice
Requirements:
  • Bachelor's degree in Data Engineering, Big Data Analytics, Computer Engineering, or related field
  • A Master's degree in a relevant field is an added advantage
  • 3+ years of Python, Java or any programming language development experience
  • 3+ years of SQL & NoSQL experience (Snowflake Cloud DW & MongoDB experience is a plus)
  • 3+ years of experience with schema design and dimensional data modeling
  • Expert proficiency in SQL, NoSQL, Python, C++, Java, R
  • Expertise in building a data lake, data warehouse, or suitable equivalent
  • Expert in AWS Cloud
  • Excellent analytical and problem-solving skills
  • A knack for independence and group work
Job Responsibility:
  • Liaising with coworkers and clients to elucidate the requirements for each task
  • Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
  • Reformulating existing frameworks to optimize their functioning
  • Testing such structures to ensure that they are fit for use
  • Building data pipelines from different data sources using various formats such as APIs, CSV, JSON, etc.
  • Preparing raw data for manipulation by Data Scientists
  • Implementing proper data validation and data reconciliation methodologies
  • Ensuring that your work remains backed up and readily accessible to relevant coworkers
  • Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs
  • Full-time

Senior Data Engineer

We are seeking a highly skilled and motivated Senior Data Engineer/s to architec...
Location:
India, Hyderabad
Salary:
Not provided
Tech Mahindra
Expiration Date:
January 30, 2026
Requirements:
  • 7-10 years of experience in data engineering with a focus on Microsoft Azure and Fabric technologies
  • Strong expertise in: Microsoft Fabric (Lakehouse, Dataflows Gen2, Pipelines, Notebooks)
  • Strong expertise in: Azure Data Factory, Azure SQL, Azure Data Lake Storage Gen2
  • Strong expertise in: Power BI and/or other visualization tools
  • Strong expertise in: Azure Functions, Logic Apps, and orchestration frameworks
  • Strong expertise in: SQL, Python and PySpark/Scala
  • Experience working with structured and semi structured data (JSON, XML, CSV, Parquet)
  • Proven ability to build metadata driven architectures and reusable components
  • Strong understanding of data modeling, data governance, and security best practices
Job Responsibility:
  • Design and implement ETL pipelines using Microsoft Fabric (Dataflows, Pipelines, Lakehouse, Warehouse, SQL) and Azure Data Factory
  • Build and maintain a metadata driven Lakehouse architecture with threaded datasets to support multiple consumption patterns
  • Develop agent specific data lakes and an orchestration layer for an uber agent that can query across agents to answer customer questions
  • Enable interactive data consumption via Power BI, Azure OpenAI, and other analytics tools
  • Ensure data quality, lineage, and governance across all ingestion and transformation processes
  • Collaborate with product teams to understand data needs and deliver scalable solutions
  • Optimize performance and cost across storage and compute layers