As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. You will collaborate with our dynamic team of technologists to develop cutting-edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable, efficient data solutions, and your expertise in data engineering will be crucial in optimizing our data-driven decision-making processes. If you're passionate about leveraging data to make a tangible impact, we welcome you to join us in shaping the future of our organization.
Job Responsibility:
Create and maintain Data Platform pipelines supporting structured, graph, and unstructured datasets
Architect and implement graph database models and schema designs, and build robust, scalable solutions
Demonstrate fluency with data engineering concepts and platforms (AWS: S3, Lambda, SNS, SQS…, Iceberg), data platforms (Snowflake), configuration (data contracts), transformation and orchestration (dbt, Airflow), and data quality (Great Expectations, Anomalo, Soda, Collibra)
Be an active participant in, and advocate for, agile/scrum ceremonies to collaborate and improve processes for our team
Collaborate with product managers, architects, and other engineers to drive the success of the Core Data Platform
Document standards and best practices for pipeline configurations, naming conventions, etc.
Ensure high operational efficiency and quality of the Core Data Platform datasets so that our solutions meet SLAs and deliver reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)
Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental technology improvements
Maintain detailed documentation of your work and changes to support data quality and data governance requirements
Requirements:
5+ years of data engineering experience developing data pipelines
Understanding of the core concepts of graph databases, their advantages over a traditional RDBMS for modeling data, and their common use cases
Proficiency in at least one major programming language (e.g., Python)
ETL development for graph databases (extracting from or loading into a graph database)
Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
Experience with Neo4j and Snowflake
Strong algorithmic problem-solving expertise
Comfortable working in a fast-paced and highly collaborative environment
Excellent written and verbal communication
Willingness and ability to learn and pick up new skill sets
Self-starting problem solver with an eye for detail and excellent analytical and communication skills
Familiar with Scrum and Agile methodologies
Bachelor's degree in a STEM field plus 5 years of relevant experience
Nice to have:
A focus on graph databases (e.g., Neo4j) is valued, though candidates with broader data engineering skills (including Snowflake) are preferred for flexibility