This position offers you the opportunity to join a fast-growing technology organization that is redefining productivity paradigms in the software engineering industry. Thanks to our flexible, distributed model of global operation and the high caliber of our experts, we have enjoyed triple-digit growth over the past five years, creating amazing career opportunities for our people. If you want to accelerate your career working with like-minded subject matter experts, solving interesting problems, and building the products of tomorrow, this opportunity is for you. As a Snowflake Data Engineer at Parser, you will be part of our team and work on challenging engineering projects. You will help improve data processes and tooling, automating workloads and pipelines wherever possible. Moreover, we expect you to bring your professional expertise to our client, not only hands-on but also in technically improving the data framework currently under development.
Job Responsibilities:
Design, implement, optimize, and productionize data pipelines in Snowflake
Assemble large, complex data sets that meet functional and non-functional business requirements
Create and maintain datasets that support business needs and products
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability
Implement processes that improve quality, consistency, and reliability across the pipelines (monitoring, retries, failure detection)
Requirements:
MS or BS in CS, Engineering, Math, Statistics, or a related field, or equivalent practical experience in data engineering
Proven track record in a data engineering environment where you have developed and deployed software and pipelines
5+ years of experience in data engineering using Python or another programming language commonly used for data engineering (Scala, Go, R, Java, etc.)
3+ years of hands-on data engineering experience with Snowflake as a data warehousing tool
Understanding of common data transformation and pipelining tools such as Airflow, dbt, Spark, and Pandas
Cloud experience: proficient in AWS, with expertise in data and analytics services such as Redshift, Kinesis, Glue, Step Functions, SageMaker, RDS, etc.
Excellent English communication skills
Must be located in Colombia or Argentina (Córdoba or Buenos Aires)
What we offer:
The chance to work on innovative projects with leading brands, using the latest technologies that fuel transformation
The opportunity to be part of an amazing, multicultural community of tech experts
The opportunity to grow and develop your career with the company