CrawlJobs

Middle Data Engineer


N-iX

Location:

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are seeking a motivated Data Engineer to join our team. In this role, you will be responsible for developing and maintaining robust data pipelines that drive our business intelligence and analytics.

Job Responsibility:

Developing and maintaining robust data pipelines that drive our business intelligence and analytics

Requirements:

  • 2+ years of experience in batch and streaming ETL using Spark, Python, Scala, Snowflake, or Databricks for Data Engineering or Machine Learning workloads
  • 2+ years orchestrating and implementing pipelines with workflow tools like Databricks Workflows, Apache Airflow, or Luigi
  • 2+ years of experience prepping structured and unstructured data for data science models
  • 2+ years of experience with containerization and orchestration technologies (Docker; Kubernetes negotiable); experience with shell scripting in Bash/Unix shell is preferred
  • Proficiency in Oracle & SQL and data manipulation techniques
  • Experience using machine learning in data pipelines to discover, classify, and clean data
  • Experience implementing CI/CD with automated testing in Jenkins, GitHub Actions, or GitLab CI/CD
  • Familiarity with AWS services including, but not limited to, Lambda, S3, and DynamoDB
  • Demonstrated experience implementing data management life cycle, using data quality functions like standardization, transformation, rationalization, linking, and matching
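To make the batch-ETL and data-prep bullets above concrete, here is a minimal, library-free sketch of the extract/transform/load pattern such pipelines follow. This is illustrative only: the field names and the in-memory "warehouse" are hypothetical stand-ins for a real Spark/Snowflake job.

```python
import csv
import io

# Hypothetical raw extract; a real pipeline would read from S3, a database, etc.
RAW = """order_id,amount,currency
1001,19.99,usd
1002,,usd
1003,250.00,eur
"""

def extract(text):
    """Extract: parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows missing an amount, normalize currency codes."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # a real job would route this to a reject/quarantine table
        clean.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return clean

def load(rows, target):
    """Load: append validated records to the target store (a list here)."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW)), warehouse)
print(loaded)        # 2 rows survive the null-amount filter
print(warehouse[0])  # {'order_id': 1001, 'amount': 19.99, 'currency': 'USD'}
```

The same extract/transform/load split scales directly to Spark DataFrames or Snowflake stages; only the I/O layer changes.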
What we offer:
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

Additional Information:

Job Posted:
January 22, 2026

Work Type:
On-site work

Similar Jobs for Middle Data Engineer

Middle Data Engineer

At LeverX, we have had the privilege of delivering over 950 projects. With 20+ y...
Location
Salary:
Not provided
LeverX
Expiration Date
Until further notice
Requirements
  • 3–5 years of experience in data engineering
  • Strong SQL and solid Python for data processing
  • Hands-on experience with at least one cloud and a modern warehouse/lakehouse: Snowflake, Redshift, Databricks, or Apache Spark/Iceberg/Delta
  • Experience delivering on Data Warehouse or Lakehouse projects: star/snowflake modeling, ELT/ETL concepts
  • Familiarity with orchestration (Airflow, Prefect, or similar) and containerization fundamentals (Docker)
  • Understanding of data modeling, performance tuning, cost-aware architecture, and security/RBAC
  • English B1+
Job Responsibility
  • Design, build, and maintain batch/streaming pipelines (ELT/ETL) from diverse sources into DWH/Lakehouse
  • Model data for analytics (star/snowflake, slowly changing dimensions, semantic/metrics layers)
  • Write production-grade SQL and Python; optimize queries, file layouts, and partitioning
  • Implement orchestration, monitoring, testing, and CI/CD for data workflows
  • Ensure data quality (validation, reconciliation, observability) and document lineage
  • Collaborate with BI/analytics to deliver trusted, performant datasets and dashboards
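The slowly-changing-dimensions bullet above refers to a standard warehouse pattern; a minimal SCD Type 2 merge can be sketched in plain Python as follows. All record and field names here are hypothetical, and a real implementation would run as MERGE SQL against the warehouse.

```python
from datetime import date

def scd2_apply(dimension, updates, as_of):
    """Type 2 merge: expire changed current rows, insert new current versions."""
    current = {r["key"]: r for r in dimension if r["end_date"] is None}
    for upd in updates:
        row = current.get(upd["key"])
        if row is not None and row["attrs"] == upd["attrs"]:
            continue  # unchanged: keep the existing current row
        if row is not None:
            row["end_date"] = as_of  # close out the old version
        dimension.append({"key": upd["key"], "attrs": upd["attrs"],
                          "start_date": as_of, "end_date": None})
    return dimension

dim = [{"key": "C1", "attrs": {"city": "Kyiv"},
        "start_date": date(2024, 1, 1), "end_date": None}]
scd2_apply(dim, [{"key": "C1", "attrs": {"city": "Lviv"}}], date(2025, 6, 1))
print(len(dim))  # 2: one expired row, one current row
```

Keeping the full row history this way is what lets analytics queries answer "what did this customer look like on date X".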
What we offer
  • Projects in different domains: Healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: Startup products, enterprise solutions, research & development projects, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay in the company for 4+ years
  • Market-based compensation and regular performance reviews
  • Internal expert communities and courses
  • Perks to support your growth and well-being

Senior Software Engineer - Trade Processing Middle Office Platform

As an experienced Staff / Senior Software Engineer, you’ll shape our flagship Mi...
Location
United States, New York
Salary:
170000.00 - 240000.00 USD / Year
Clear Street
Expiration Date
Until further notice
Requirements
  • Bachelor's Degree in Computer Science or Engineering
  • 10+ years of strong proficiency in Java/Spring Boot, Spring, RDBMS, service-oriented architecture (SOA), and microservice-based server-side application development
  • Strong experience with distributed systems, event-driven architecture, and tools like Kafka
  • Practical knowledge of relational databases (e.g., Postgres) and schema design
  • You have contributed to systems that deliver solutions to complex business problems that handle massive amounts of data
  • You prioritize end-user experience, and it shows in your API designs, functionality, and performance
  • You have a strong command over design patterns, data structures, and algorithms
  • You have strong problem-solving skills with a keen eye for performance optimization
  • You can clearly explain the nuances of system design and paradigms to engineers and stakeholders
  • Strong understanding of multi-threading, concurrency, and performance tuning
Job Responsibility
  • Architect and build highly available, horizontally scalable mission critical applications in a modern technology stack
  • Design, build, and optimize core components responsible for processing a high volume of trade data in a low latency environment
  • Solve complex performance and scalability challenges, ensuring our systems handle large-scale financial data efficiently
  • Collaborate with product managers, and other engineers to translate financial methodologies into robust software solutions
  • Lead by example in system design discussions, architectural trade-offs, and best practices
  • Mentor team members, contributing to a strong culture of engineering excellence
What we offer
  • Competitive compensation, benefits, and perks
  • Company equity
  • 401k matching
  • Gender neutral parental leave
  • Full medical, dental and vision insurance
  • Lunch stipends
  • Fully stocked kitchens
  • Happy hours
  • Full-time

Middle Palantir Foundry Developer

At LeverX, we have had the privilege of delivering 1,500+ projects. With 20+ yea...
Location
Uzbekistan, Georgia
Salary:
Not provided
LeverX
Expiration Date
Until further notice
Requirements
  • 3+ years in data/analytics engineering or software development
  • Hands-on experience with Palantir Foundry (pipelines and/or applications)
  • Proficiency in Python and SQL
  • Confidence with Git
  • Ability to translate business requirements into working solutions
  • English B1+
Job Responsibility
  • Build and maintain data pipelines and transformations in Foundry
  • Implement application logic, views, and access controls
  • Validate data and ensure basic documentation and support
  • Work with stakeholders to clarify requirements and iterate on features
What we offer
  • Impactful use-case delivery on real data
  • Possibility to progress into a Team Lead role: mentoring, design facilitation, and coordination
  • Projects in different domains: Healthcare, manufacturing, e-commerce, fintech, etc.
  • Projects for every taste: Startup products, enterprise solutions, research & development projects, and projects at the crossroads of SAP and the latest web technologies
  • Global clients based in Europe and the US, including Fortune 500 companies
  • Employment security: We hire for our team, not just for a specific project. If your project ends, we will find you a new one
  • Healthy work atmosphere: On average, our employees stay in the company for 4+ years

Middle QA Big Data

The objective of this project is to enhance the QA processes through the impleme...
Location
Ukraine
Salary:
Not provided
N-iX
Expiration Date
Until further notice
Requirements
  • 3+ years of QA experience with a strong focus on Big Data testing, particularly with hands-on experience in Data Lake environments on any cloud platform (preferably Azure)
  • Experience with Azure
  • Hands-on experience in Azure Data Factory, Azure Synapse Analytics, or similar services
  • Proficiency in SQL, capable of writing and optimizing both simple and complex queries for data validation and testing purposes
  • Experienced in PySpark, with experience in data manipulation and transformation, and a demonstrated ability to write and execute test scripts for data processing and validation (ability to understand the code and convert the logic to SQL)
  • Hands-on experience with Functional & System Integration Testing in big data environments, ensuring seamless data flow and accuracy across multiple systems
  • Knowledge and ability to design and execute test cases in a behavior-driven development environment
  • Fluency in Agile methodologies, with active participation in Scrum ceremonies and a strong understanding of Agile principles
  • Familiarity with tools like Jira, including experience with X-Ray for defect management and test case management
  • Proven experience working on high-traffic and large-scale software products, ensuring data quality, reliability, and performance under demanding conditions
Job Responsibility
  • Design and execute data validation tests to ensure completeness in Azure Data Lake Storage (ADLS), Azure Synapse, and Databricks
  • Verify data ingestion, transformation, and loading (ETL/ELT) processes in Azure Data Factory (ADF)
  • Validate data schema, constraints, and format consistency across different storage layers
  • Conduct performance testing on data pipelines
  • Optimize query performance by working with data engineers
  • Identify, log, and track defects in JIRA
  • Collaborate with Data Engineers and Business Analysts to resolve data inconsistencies
  • Generate detailed test reports, dashboards, and documentation for stakeholders
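The validation responsibilities above (schema consistency, completeness, reconciliation) boil down to checks like the following sketch. This is a portable stand-in, not the actual test suite: real checks would run against ADLS/Synapse/Databricks tables, and all column names here are hypothetical.

```python
# Expected schema for a hypothetical events table.
EXPECTED_SCHEMA = {"event_id": int, "user_id": int, "amount": float}

def validate_schema(rows, schema=EXPECTED_SCHEMA):
    """Return a list of human-readable schema violations."""
    errors = []
    for i, row in enumerate(rows):
        for col, typ in schema.items():
            if col not in row:
                errors.append(f"row {i}: missing column {col!r}")
            elif not isinstance(row[col], typ):
                errors.append(f"row {i}: {col!r} is "
                              f"{type(row[col]).__name__}, expected {typ.__name__}")
    return errors

def validate_completeness(source_count, target_rows):
    """Reconciliation check: no rows lost between source and target."""
    return source_count == len(target_rows)

target = [{"event_id": 1, "user_id": 7, "amount": 9.5},
          {"event_id": 2, "user_id": 8, "amount": "bad"}]
print(validate_schema(target))           # flags the string amount in row 1
print(validate_completeness(2, target))  # True: row counts reconcile
```

In PySpark the same ideas map onto comparing `df.schema` against an expected `StructType` and reconciling `df.count()` across pipeline stages.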
What we offer
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

Rust Engineer - Platform

As a Platform Backend Engineer (Rust) at Keyrock, you will drive the development...
Location
Salary:
Not provided
Keyrock
Expiration Date
Until further notice
Requirements
  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent experience
  • Proven experience in building and maintaining data-intensive, large-scale, high-performance trading data platforms
  • Strong expertise in Rust (or C++), Python, and TypeScript for system development and automation in the financial services industry
  • Good understanding of data engineering principles, including data modeling, ETL pipelines, and stream processing
  • Experience with financial services data workflows, including trading, middle office, and back office operations
  • Extensive experience in cloud-native architectures, with proficiency in AWS
  • Proficient in GitOps tools and methodologies for infrastructure automation and deployment
  • Strong background in DevSecFinOps, ensuring compliance, security, and cost efficiency across the development lifecycle
  • Hands-on experience with CI/CD pipelines, infrastructure as code (IaC), and monitoring tools
Job Responsibility
  • Rust Development: Design, build, and maintain high-performance backend services and APIs using Rust, ensuring low latency and high availability for critical trading data platforms
  • Strong systems engineering fundamentals: concurrency, memory management, networking, serialization, and observability; solid understanding of performance tuning and profiling in real-world systems
  • System Integration: Create seamless integrations between live trading operations (exchanges/DeFi) and backoffice systems, automating workflows to improve operational efficiency
  • Cloud-Native Deployment: Deploy and manage services in a cloud-native environment, leveraging AWS, Kubernetes, and Terraform to scale infrastructure as code
  • DevOps & Observability: Maintain GitOps-driven workflows, ensuring robust CI/CD pipelines and implementing deep system observability (logging, metrics, tracing) for rapid incident response
  • Database Optimization: Optimize data storage and retrieval strategies (SQL/NoSQL), balancing query performance, cost efficiency, and data integrity in a high-volume financial environment
  • Security & Compliance: Engineer solutions with a "Security-First" mindset, ensuring strict adherence to compliance standards and secure handling of sensitive financial data
  • Cross-Functional Collaboration: Partner with Product Managers, Risk teams, and other engineers to translate complex business requirements into reliable technical specifications and features
  • Technical Excellence: Actively participate in code reviews, contribute to architectural discussions, and mentor fellow engineers to foster a culture of high code quality and innovation
  • Continuous Improvement: Stay updated on emerging trends in the Rust ecosystem, cloud infrastructure, and blockchain technologies to continuously refine the platform’s capabilities
What we offer
  • A competitive salary package
  • Autonomy in your time management thanks to flexible working hours and the opportunity to work remotely
  • The freedom to create your own entrepreneurial experience by being part of a team of people in search of excellence
  • Full-time

Middle Data Warehouse Engineer

We are seeking a motivated Middle Data Warehouse Engineer to join our team. In t...
Location
Salary:
Not provided
N-iX
Expiration Date
Until further notice
Requirements
  • At least 4 years of experience in this or a similar role
  • Intermediate proficiency with Oracle and PL/SQL
  • Experience with ETL processes and data pipeline development
  • Familiarity with shell scripting (Unix) and basic command-line debugging
  • Working knowledge of AWS CLI
  • Basic knowledge of Java
  • A solid understanding of data warehousing concepts
  • Strong problem-solving skills and the ability to learn from technical documentation and training materials
Job Responsibility
  • Design, develop, and maintain ETL (Extract, Transform, Load) processes to move data from various sources to our data warehouse
  • Write and optimize complex queries and scripts using PL/SQL and Oracle to transform and load data
  • Automate and orchestrate data workflows using shell scripting (Unix/Korn shell)
  • Utilize AWS CLI for tasks such as managing data in S3 or interacting with other AWS services
  • Debug and troubleshoot data pipeline issues to ensure data accuracy and availability for downstream consumers
  • Collaborate with stakeholders and team members to understand data requirements and deliver reliable solutions
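The first two responsibilities above (ETL plus optimized load SQL) can be sketched portably. Oracle and PL/SQL cannot be demonstrated without a database server, so the sketch below uses Python's built-in sqlite3 as a stand-in; the staging/warehouse table names are hypothetical.

```python
import sqlite3

# In-memory stand-in for the Oracle staging and warehouse schemas.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE dwh_orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                 [(1, 10.0), (2, None), (3, 30.0)])

# Transform-and-load: filter NULL amounts, idempotently merge into the
# warehouse table (a PL/SQL job would typically use MERGE for this step).
conn.execute("""
    INSERT OR REPLACE INTO dwh_orders (order_id, amount)
    SELECT order_id, amount FROM stg_orders WHERE amount IS NOT NULL
""")
loaded = conn.execute("SELECT COUNT(*) FROM dwh_orders").fetchone()[0]
print(loaded)  # 2 rows survive the NULL filter
```

Because the load is keyed on `order_id` and uses replace-on-conflict, rerunning the job after a failure does not duplicate rows, which is the property the "data accuracy and availability" bullet is asking for.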
What we offer
  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

Applications Development Senior Group Manager

This role will be part of the Risk Data team and is a senior management level po...
Location
United Kingdom, London
Salary:
Not provided
Citi
Expiration Date
Until further notice
Requirements
  • Strong academic record, ideally with a Bachelor’s or Master’s degree in Computer Science or engineering or related technical discipline
  • Proven experience in enterprise application development with full stack technologies
  • Strong architecture and hands-on technical experience implementing large-volume, real-time, complex solutions on Big Data and public cloud platforms
  • Experience in Data architecture, strong Software development fundamentals, data structures, design patterns, object-oriented principles
  • Experience in design and delivery of multi-tiered applications and high performance server side components
  • Skills on system performance tuning, high performance, low latency, multithreading and experience with Java server side programming
  • Preferred experience in Handling high volumes of data and working with In-memory databases and Caching solutions
  • Experience of building and leading teams, ideally with a global resource profile and demonstrated ability to deliver large projects efficiently and on time
  • Significant experience in large Financial Services Technology services companies is expected for this position
  • Hands-on development, architecture and leadership experience in real-time data engineering platforms implementation
Job Responsibility
  • Lead the efforts in Institutional Data Platform (ICG) that span multiple businesses, products and functions
  • Delivery of Price Risk related Data initiatives and Capital reporting (GSIB) related deliverables
  • Establish strong relationships with the global business stakeholders and ensure transparency of project deliveries
  • Actively identify and manage risks and issues, working with disparate teams to create mitigation plans and follow-through to resolution
  • Adhere to all key Project Management (PMQC) & Engineering Excellence standards
  • Ensure timely communications to Senior Technology Management and Business Partners in Front Office, Middle Office & other Operations functions
  • Drive the design and development of system architecture, work with end-users of the systems, and enhance the quality of deliverables
  • Ensure staff follows Citi documented policy and procedures, and maintain desktop procedures and supporting documentation for filings on a current basis and in a comprehensive manner
  • Ensure change is managed with appropriate controls, documentation, and approvals including implementation of new and revised regulatory reporting requirements
  • Manage and maintain all disaster recovery plans, oversee appropriate testing, and provide permit-to-operate for new applications
What we offer
  • 27 days annual leave (plus bank holidays)
  • A discretional annual performance related bonus
  • Private Medical Care & Life Insurance
  • Employee Assistance Program
  • Pension Plan
  • Paid Parental Leave
  • Special discounts for employees, family, and friends
  • Access to an array of learning and development resources
  • Full-time

Middle Python Engineer

We’re opening the position of a Middle Python Engineer (Python, pandas, FastAPI)...
Location
Poland
Salary:
Not provided
Edvantis
Expiration Date
Until further notice
Requirements
  • Strong knowledge of the Python programming language and the relevant ecosystem (boto3, AWS, Snowflake, SQL, HTTP, REST API, OAuth2, RegExp, Xpath, unit testing)
  • Understanding the infrastructure of AWS or Azure
  • Good knowledge of CI/CD pipelines and version control tools (e.g., Git, GitHub Actions, Azure DevOps)
  • Responsibility and attention to detail; focus on quality
  • Readiness to work on enabler assignments: researching open data sources and reviewing report data
  • Bachelor’s degree in Computer Science, Software Engineering, or a related field (or equivalent experience)
  • English level – Upper-Intermediate or higher (sufficient to handle conversation in video-off mode)
Job Responsibility
  • Develop and optimize scalable back-end services and APIs using Python
  • Design and implement AI-driven agents to enhance efficiency
  • Work with cloud platforms to deploy and manage AI-driven applications
  • Develop secure, high-performance microservices and data pipelines
  • Optimize database performance and ensure data consistency in SQL databases (e.g., PostgreSQL)
  • Collaborate with Front-End and AI/ML Teams to build end-to-end AI-powered solutions
  • Ensure best practices in CI/CD, DevOps, and cloud infrastructure management
  • Stay up to date with AI advancements and emerging back-end technologies
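The requirements for this role call out RegExp and unit testing specifically; a hedged sketch of the kind of utility that combination might involve is below. The function name and normalization rules are hypothetical, not taken from the posting.

```python
import re

# Collapse any run of non-alphanumeric characters into a single dash.
_ID_RE = re.compile(r"[^A-Za-z0-9]+")

def normalize_id(raw: str) -> str:
    """Normalize a free-text identifier scraped from an open data source:
    collapse punctuation/whitespace runs to dashes, trim, lowercase."""
    return _ID_RE.sub("-", raw.strip()).strip("-").lower()

print(normalize_id("  Report #42 (Q3/2025) "))  # report-42-q3-2025
```

In practice such helpers are pinned down with unit tests (pytest or unittest) covering edge cases like empty strings and consecutive separators, which is exactly the "unit testing" skill the listing asks for.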
What we offer
  • Remote-first work model with flexible working hours (we provide all equipment)
  • Comfortable and fully equipped offices in Lviv and Rzeszów
  • Competitive compensation with regular performance reviews
  • 18 paid vacation days per year + all state holidays
  • 12 days of paid sick leave per year without a medical certificate + extra paid leave for blood donation
  • Medical insurance with an affordable family coverage option
  • Mental health program which includes free and confidential consultations with a psychologist
  • English, German, and Polish language courses
  • Corporate subscription to learning platforms, regular meetups and webinars
  • Friendly team that values accountability, innovation, teamwork, and customer satisfaction
  • Full-time