We are seeking a talented Data Engineer to design, build, and maintain our data infrastructure supporting mission-critical energy operations. You'll work at the intersection of renewable energy and data technology, developing pipelines that process everything from real-time asset performance data to complex trading and risk analytics. This hybrid role offers the opportunity to make a direct impact on clean energy operations while working with cutting-edge data platforms, including Snowflake, Dagster, dbt, and Modal.
Job Responsibilities:
Design, deploy, and maintain scalable data infrastructure to support enterprise analytics and reporting needs
Manage Snowflake instances, including performance tuning, security configuration, and capacity planning for growing data volumes
Optimize query performance and resource utilization to control costs and improve processing speed (a cost-monitoring sketch follows this list)
Build and orchestrate complex ETL/ELT workflows using Dagster to ensure reliable, automated data processing for asset management and energy trading (see the Dagster sketch after this list)
Develop robust data pipelines that handle high-volume, time-sensitive energy market data as well as asset generation and performance metrics
Implement workflow automation and dependency management for critical business operations
Develop and maintain dbt models to transform raw data into business-ready analytical datasets and dimensional models (illustrated after this list)
Create efficient SQL-based transformations for complex energy market calculations and asset performance metrics
Support advanced analytics initiatives through proper data preparation and feature engineering
Implement comprehensive data validation, testing, and monitoring frameworks to ensure accuracy and consistency across all energy and financial data assets (a validation sketch follows this list)
Establish data lineage tracking and privacy controls to meet regulatory compliance requirements in the energy sector
Develop alerting and monitoring systems for data pipelines, including error handling, SLA monitoring, and incident response
Lead continuous integration and deployment initiatives for Dagster and dbt pipelines, as well as Streamlit/Gradio application deployments to Linux servers (see the CI sketch after this list)
Implement automated testing and deployment automation for data pipelines and analytics applications
Manage version control and infrastructure as code practices
Partner with Analytics Engineers, Data Scientists, and business stakeholders to understand requirements and deliver solutions
Work closely with asset management and trading groups to ensure real-time data availability for market operations and risk calculations
Collaborate with credit risk teams to develop data models supporting financial analysis and regulatory reporting
Translate business requirements into technical solutions and communicate data insights to stakeholders
Create and maintain technical documentation, data dictionaries, and onboarding materials for data assets
Implement role-based access controls, data encryption, and security best practices across the data stack
Monitor and optimize cloud infrastructure costs, implement resource allocation strategies, and provide cost forecasting
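To give a concrete flavor of the Snowflake cost-monitoring work above, here is a rough sketch that reads recent warehouse credit usage via snowflake-connector-python. The connection parameters are placeholders, and querying the ACCOUNT_USAGE share requires a suitably privileged role:

```python
# Sketch: weekly warehouse credit usage, assuming placeholder credentials
# and a role with ACCOUNT_USAGE access.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...", role="ACCOUNTADMIN"
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT warehouse_name, SUM(credits_used) AS credits
        FROM snowflake.account_usage.warehouse_metering_history
        WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
        GROUP BY warehouse_name
        ORDER BY credits DESC
    """)
    for name, credits in cur.fetchall():
        print(f"{name}: {credits:.1f} credits in the last 7 days")
finally:
    conn.close()
```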
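For the Dagster orchestration responsibilities, a minimal sketch of an asset pipeline on an hourly schedule; the asset names, the hard-coded sample row, and the schedule are illustrative assumptions, not a production pipeline:

```python
# Sketch: two dependent Dagster assets plus an hourly schedule.
from dagster import Definitions, ScheduleDefinition, asset, define_asset_job

@asset
def raw_market_prices() -> list[dict]:
    # Placeholder extract step; a real pipeline would call a market-data API.
    return [{"node": "HUB_A", "price_mwh": 42.1}]

@asset
def hourly_price_summary(raw_market_prices: list[dict]) -> dict:
    # Aggregate the raw feed into a single summary record.
    prices = [row["price_mwh"] for row in raw_market_prices]
    return {"avg_price_mwh": sum(prices) / len(prices)}

ingest_job = define_asset_job("ingest_market_data", selection="*")

defs = Definitions(
    assets=[raw_market_prices, hourly_price_summary],
    jobs=[ingest_job],
    schedules=[ScheduleDefinition(job=ingest_job, cron_schedule="0 * * * *")],
)
```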
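The dbt modeling work is usually plain SQL with Jinja refs; to keep these sketches in one language, here is the equivalent as a dbt Python model on Snowpark. The upstream model stg_meter_readings and its columns are hypothetical:

```python
# Sketch: models/daily_asset_output.py as a dbt Python model
# (Snowflake/Snowpark); upstream model and column names are assumptions.
def model(dbt, session):
    dbt.config(materialized="table")
    readings = dbt.ref("stg_meter_readings")  # Snowpark DataFrame
    # Roll raw meter readings up to one row per asset per day.
    return readings.group_by("asset_id", "reading_date").sum("mwh")
```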
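A library-free sketch of the row-level validation implied by the data quality responsibilities; the field names and rules are assumptions:

```python
# Sketch: validate one meter/market record and report problems found.
from datetime import datetime, timezone

def validate_reading(row: dict) -> list[str]:
    """Return human-readable problems found in one record."""
    problems = []
    if row.get("mwh") is None or row["mwh"] < 0:
        problems.append("mwh must be present and non-negative")
    ts = row.get("ts")
    if not isinstance(ts, datetime) or ts > datetime.now(timezone.utc):
        problems.append("ts must be a timestamp and not in the future")
    return problems

bad = {"mwh": -5, "ts": datetime.now(timezone.utc)}
assert validate_reading(bad) == ["mwh must be present and non-negative"]
```

In production these checks would feed the alerting and SLA monitoring described above rather than plain asserts.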
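Finally, a rough sketch of a "slim CI" gate for the dbt side of the CI/CD work, assuming the dbt CLI is on PATH and a production manifest has been fetched into ./prod-artifacts:

```python
# Sketch: fail the CI job if changed dbt models do not build and test cleanly.
import subprocess
import sys

def run(cmd: list[str]) -> None:
    print("+", " ".join(cmd))
    if subprocess.run(cmd).returncode != 0:
        sys.exit(1)  # stop the pipeline on the first broken step

run(["dbt", "deps"])                 # install package dependencies
run(["dbt", "build",                 # compile, run, and test models
     "--select", "state:modified+",  # only models changed vs. production
     "--state", "prod-artifacts"])   # manifest from the last prod deploy
```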
Requirements:
2-4 years of hands-on data engineering experience in production environments
Bachelor's degree in Computer Science, Engineering, or a related field
Proficiency in Dagster or Airflow for pipeline scheduling, dependency management, and workflow automation
Advanced Snowflake administration skills, including virtual warehouses, clustering, security, and cost optimization
Proficiency in dbt for data modeling, testing, documentation, and version control of analytical transformations
Strong Python and SQL skills for data processing and automation
1-2+ years of experience with continuous integration and continuous deployment practices and tools (Git, GitHub Actions, GitLab CI, or similar)
Advanced SQL skills, database design principles, and experience with multiple database platforms
Proficiency in AWS/Azure/GCP data services, storage solutions (S3, Azure Blob, GCS), and infrastructure as code
Experience with APIs, streaming platforms (Kafka, Kinesis), and various data connectors and formats (a consumer sketch follows this list)
Strong analytical and troubleshooting skills with attention to detail
Ability to work effectively with both technical and business stakeholders
Comfortable with a hybrid work environment (2 days in office)
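As an illustration of the streaming experience above, a minimal kafka-python consumer; the broker address, topic name, and message shape are assumptions:

```python
# Sketch: consume JSON price ticks from a hypothetical Kafka topic.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "energy-market-prices",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)
for message in consumer:
    tick = message.value
    print(f"{tick['node']}: {tick['price_mwh']} $/MWh")
```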
Nice to have:
Experience in the energy, utilities, or financial services industries
Knowledge of energy trading concepts, market data, or asset management
Familiarity with data governance frameworks and regulatory compliance (SOX, GDPR, CCPA)
Experience with Streamlit or Gradio for analytics application development (a brief sketch follows this list)
Experience with infrastructure as code (Terraform, CloudFormation)
Understanding of data security best practices and access controls
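For the Streamlit/Gradio item above, a minimal Streamlit sketch; the CSV file and column names are placeholders:

```python
# Sketch: a tiny asset-performance dashboard (run with `streamlit run app.py`).
import pandas as pd
import streamlit as st

st.title("Asset Performance")
df = pd.read_csv("daily_generation.csv", parse_dates=["day"])
site = st.selectbox("Site", sorted(df["site"].unique()))
st.line_chart(df[df["site"] == site].set_index("day")["mwh"])
```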
What we offer:
generous PTO
medical, dental & vision care
HSAs with company contributions
health FSAs
dependent daycare FSAs
commuter benefits
relocation assistance
a 401(k) plan with employer match
a variety of life & accident insurances
fertility programs
adoption assistance
generous parental leave
tuition reimbursement
benefits for employees in same-sex marriages, civil unions & domestic partnerships
annual cash bonus, subject to personal and company performance goals