At Aristocrat, we are committed to bringing happiness to life through the power of play, and we are seeking experienced individuals who share our passion and ambition. An outstanding opportunity awaits for a highly skilled Senior Data Engineer to join our exceptional team. As a Senior Data Engineer at Aristocrat, you will collaborate with industry experts to deliver reliable, scalable data solutions. You will be pivotal in constructing robust data pipelines and ensuring the security and reliability of our data infrastructure. This is your chance to make a significant impact and join an organization that values innovation and excellence.
Job Responsibilities:
Design, build, and maintain batch and streaming data pipelines
Implement scalable data transformations using Python scripts and orchestrate workflows via Airflow or equivalent tools
Integrate with data platforms such as Snowflake, ensuring efficient data storage and retrieval
Build and maintain large datasets from a variety of structured and unstructured data sources
Write optimized SQL and Python scripts for data manipulation and ETL processes
Maintain data quality, observability, and pipeline reliability through monitoring and alerting
Collaborate with analytics and business teams to deliver high-impact data solutions
Follow guidelines for version control, documentation, and CI/CD in a collaborative environment
Requirements:
3+ years of hands-on experience in data engineering
Strong experience with Snowflake or similar cloud data warehouses
Strong experience with Airflow or other orchestration tools
Strong experience with Python, SQL, and shell scripting
Hands-on experience with AWS, Azure, or GCP services (e.g., Pub/Sub, GCS)
Good understanding of data architecture, security, and performance tuning
Familiarity with version control (e.g., Git), CI/CD tools, and agile workflows
Understanding of how to effectively structure and model data for storage, retrieval, and analysis
Nice to have:
Experience with data streaming tools
Exposure to infrastructure-as-code tools like Terraform
Prior experience working with fast-paced product or analytics teams
Familiarity with BI tools (Power BI, Looker, Tableau)