Data Test Engineer

OptiSol Business Solutions

Location:
Chennai, India

Contract Type:
Not provided

Salary:
Not provided

Job Description:

We are looking for a skilled Data Test Engineer who can design, build, and validate end-to-end data pipelines. In this role, you will work closely with data engineers and business teams to ensure that data is accurate, complete, and reliable. You will be responsible for testing data workflows, writing complex SQL queries, automating data quality checks, and integrating validations into CI/CD pipelines. If you have a strong background in data engineering and a keen eye for quality, we’d love to have you on our team.

Job Responsibilities:

  • Design, develop, and maintain robust ETL/ELT pipelines to process large volumes of structured and unstructured data using Azure Data Factory, PySpark, and SQL-based tools
  • Collaborate with data architects and analysts to understand transformation requirements and implement business rules correctly
  • Develop and execute complex SQL queries to validate, transform, and performance-tune data workflows
  • Perform rigorous data validation including source-to-target mapping (S2T), data profiling, reconciliation, and transformation rule testing
  • Conduct unit, integration, regression, and performance testing for data pipelines and storage layers
  • Automate data quality checks using Python and frameworks like Great Expectations, dbt, or custom-built tools
  • Monitor data pipeline health and implement observability through logging, alerting, and dashboards
  • Integrate testing into CI/CD workflows using tools like Azure DevOps, Jenkins, or GitHub Actions
  • Troubleshoot and resolve data quality issues, schema changes, and pipeline failures
  • Ensure compliance with data privacy, security, and governance policies
  • Maintain thorough documentation for data flows, test logic, and validation processes
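As a rough illustration of the validation work described above — the table names, key column, and sample rows here are invented for the sketch, and a real pipeline would run against the warehouse rather than an in-memory SQLite database — a minimal source-to-target reconciliation check might look like:

```python
import sqlite3

def reconcile(conn, source_table, target_table, key_col):
    """Compare row counts and key coverage between a source and a target table.

    Illustrative only: in practice this would run against the actual
    warehouse tables named in the S2T mapping document.
    """
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    # Keys present in the source but missing from the target (load gaps)
    missing = cur.execute(
        f"SELECT COUNT(*) FROM {source_table} s "
        f"LEFT JOIN {target_table} t ON s.{key_col} = t.{key_col} "
        f"WHERE t.{key_col} IS NULL"
    ).fetchone()[0]
    return {"source_rows": src_count, "target_rows": tgt_count, "missing_keys": missing}

# Tiny demonstration on an in-memory database
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src(id INTEGER, amount REAL);
    CREATE TABLE tgt(id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);
""")
result = reconcile(conn, "src", "tgt", "id")
print(result)  # {'source_rows': 3, 'target_rows': 2, 'missing_keys': 1}
```

In a CI/CD pipeline, a check like this would fail the build (or raise an alert) whenever `missing_keys` is nonzero or the counts diverge beyond a tolerance.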

Requirements:

  • 4+ years of experience in Data Engineering and Data/ETL Testing
  • Strong expertise in writing and optimizing SQL queries (joins, subqueries, window functions, performance tuning)
  • Proficiency in Python or PySpark for data transformation and automation
  • Hands-on experience with ETL tools such as Azure Data Factory, Talend, SSIS, or Informatica
  • Familiarity with cloud platforms, preferably Azure; AWS or GCP is a plus
  • Experience working with data lakes, data warehouses (Snowflake, BigQuery, Redshift), and modern data platforms
  • Knowledge of version control systems (Git), issue tracking tools (JIRA), and Agile methodologies
  • Exposure to data testing frameworks like Great Expectations, dbt tests, or custom validation tools
  • Experience integrating data testing into CI/CD pipelines
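One concrete example of the SQL skills listed above — window functions used for validation — is flagging duplicate business keys during reconciliation. The table and column names below are invented for illustration; the same query shape works on any warehouse that supports `ROW_NUMBER()`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders(order_id INTEGER, customer TEXT, loaded_at TEXT);
    INSERT INTO orders VALUES
        (100, 'acme', '2026-01-01'),
        (100, 'acme', '2026-01-02'),  -- duplicate business key
        (101, 'beta', '2026-01-01');
""")

# ROW_NUMBER() flags every copy of a business key beyond the newest one,
# a common window-function pattern in data-quality validation queries.
dupes = conn.execute("""
    SELECT order_id FROM (
        SELECT order_id,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id ORDER BY loaded_at DESC
               ) AS rn
        FROM orders
    ) WHERE rn > 1
""").fetchall()
print(dupes)  # [(100,)]
```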

Nice to have:

Familiarity with Airflow, Databricks, BI tools (Power BI, Tableau), and metadata management practices

What we offer:
  • Competitive salary aligned with industry standards
  • Hands-on experience with enterprise-scale data platforms and cloud-native tools
  • Opportunities to work on data-centric initiatives across AI, analytics, and enterprise transformation
  • Access to internal learning accelerators, mentorship, and career growth programs
  • Flexible work culture, wellness initiatives, and comprehensive health benefits

Additional Information:

Job Posted:
January 05, 2026

Employment Type:
Full-time
Similar Jobs for Data Test Engineer


Software Engineer - Data Engineering

Akuna Capital is a leading proprietary trading firm specializing in options mark...
Location: Chicago, United States
Salary: 130000.00 USD / Year
AKUNA CAPITAL
Expiration Date: Until further notice

Requirements:
  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or equivalent technical field
  • 5+ years of professional experience developing software applications
  • Java/Scala experience required
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Prior hands-on experience with data platforms and technologies such as Delta Lake, Spark, Kubernetes, Kafka, ClickHouse, and/or Presto/Trino
  • Experience building large-scale batch and streaming pipelines with strict SLA and data quality requirements
  • Must possess excellent communication, analytical, and problem-solving skills
  • Recent hands-on experience with AWS Cloud development, deployment and monitoring necessary
  • Demonstrated experience working on an Agile team employing software engineering best practices, such as GitOps and CI/CD, to deliver complex software projects
  • The ability to react quickly and accurately to rapidly changing market conditions, including quickly and accurately solving math and coding problems, is an essential function of the role
Job Responsibilities:
  • Work within a growing Data Engineering division supporting the strategic role of data at Akuna
  • Drive the ongoing design and expansion of our data platform across a wide variety of data sources, supporting an array of streaming, operational and research workflows
  • Work closely with Trading, Quant, Technology & Business Operations teams throughout the firm to identify how data is produced and consumed, helping to define and deliver high impact projects
  • Build and deploy batch and streaming pipelines to collect and transform our rapidly growing Big Data set within our hybrid cloud architecture utilizing Kubernetes/EKS, Kafka/MSK and Databricks/Spark
  • Mentor junior engineers in software and data engineering best practices
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Build automated data validation test suites that ensure data is processed and published in accordance with well-defined Service Level Agreements (SLAs) pertaining to data quality, availability, and correctness
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack
What we offer:
  • Discretionary performance bonus
  • Comprehensive benefits package that may encompass employer-paid medical, dental, vision, retirement contributions, paid time off, and other benefits
Employment Type: Full-time

Sr. Data Engineer I, Data Infrastructure

Khan Academy is partnering with SafeInsights to provide researchers access to st...
Location: Mountain View, United States
Salary: 137871.00 - 172339.00 USD / Year
Khan Academy
Expiration Date: Until further notice

Requirements:
  • 5+ years of experience in data engineering within a production environment
  • Experience with Google infrastructure services and data services, particularly GCS and BigQuery
  • Advanced knowledge of Python
  • Strong SQL skills
  • Demonstrated proficiency with Docker and Kubernetes
  • Experience with Airflow
  • Background in applied math, statistics, economics, or a related technical field
  • Motivated by the Khan Academy mission “to provide a free, world-class education for anyone, anywhere”
  • Proven cross-cultural competency skills demonstrating self-awareness, awareness of others, and the ability to adopt inclusive perspectives, attitudes, and behaviors to drive inclusion and belonging throughout the organization
Job Responsibilities:
  • Collaborate with our Data Insights Group to understand the desired end data product and the overall project's goals
  • Collaborate with our data architect and Data Infrastructure team to understand available data sources and tooling
  • Propose tooling and processes that will work with extant Khan Academy systems to meet the project's goals
  • Build, test, and refine these tools and processes
  • Deliver the SafeInsights container and associated project deliverables
  • Provide documentation and training required to maintain and enhance the above
What we offer:
  • Competitive salaries
  • Ample paid time off as needed
  • Remote-first culture
  • Generous parental leave
  • An exceptional team
  • The chance to put your talents towards a deeply meaningful mission
  • Opportunities to connect through affinity, ally, and social groups
  • 401(k) + 4% matching
  • Comprehensive insurance, including medical, dental, vision, and life
Employment Type: Full-time

Electrical Engineer II, RF Test Engineer

As an RF Test Engineer, your work will be essential in ensuring that the wireles...
Location: Scottsdale, United States
Salary: Not provided
Axon
Expiration Date: Until further notice

Requirements:
  • Bachelor’s degree in Electrical Engineering, RF Engineering or similar discipline
  • Minimum of 3 years of hands-on RF and wireless device testing experience
  • Minimum of 3 years of wireless device test plan development for evaluating performance of various wireless interfaces against system and product requirements
  • Solid understanding of basic RF principles and theory, familiar with basic RF technology circuitry stages and topologies
  • Experience with modern RF technology including multiband Wi-Fi, Bluetooth, Cellular (LTE, 5G FR1), and multi-constellation GNSS and their test methodologies
  • Cellular experience is a must-have
  • High level of familiarity with cellular certification processes, test requirements, and test methods (PTCRB, CTIA, GCF, carrier testing, etc.), with hands-on experience in device pretesting
  • Familiarity with worldwide EMC and intentional-radiator regulatory standards and test methods (FCC, CE, ETSI, etc.)
  • Proficient with operation of RF equipment such as RF communication testers (CMW, CMX, Litepoint), Network Analyzers, Spectrum Analyzers, RF power meters, Signal Generators, RF Test chambers
  • Experience with collaborative RF debug at device and circuit level
Job Responsibilities:
  • Design and implement test plans to evaluate the performance of wirelessly connected devices, ensuring they meet stringent product and regulatory requirements
  • Record RF test data and organize results, creating clear, actionable reports
  • Present findings and recommendations to cross-functional teams, enabling swift resolution of design challenges
  • Evaluate RF performance characteristics such as Total Radiated Power (TRP), Total Isotropic Sensitivity (TIS), and other over-the-air (OTA) metrics for cellular-enabled wireless devices using advanced equipment
  • Use instruments like spectrum analyzers and radio communication testers to assess conducted performance of wireless technologies, including LTE, 5G, Wi-Fi, Bluetooth, and GNSS
  • Support product field evaluations to validate RF performance under diverse real-world conditions, such as urban or rural environments
  • Investigate and debug issues such as poor sensitivity, low transmit power, or signal interference in wireless devices
  • Assist in the regulatory pre-certification and formal certification test efforts for hardware by preparing test samples, working closely with external test labs, troubleshooting issues, and ensuring successful completion of certification activities
  • Expand your knowledge of RF systems, testing methodologies, and certification standards while working closely with experienced engineers to develop your technical expertise and contribute to innovative designs
What we offer:
  • Competitive salary and 401k with employer match
  • Discretionary paid time off
  • Paid parental leave for all
  • Medical, Dental, Vision plans
  • Fitness Programs
  • Emotional & Mental Wellness support
  • Learning & Development programs
Employment Type: Full-time

Data Engineer, Enterprise Data, Analytics and Innovation

Are you passionate about building robust data infrastructure and enabling innova...
Location: United States
Salary: 110000.00 - 125000.00 USD / Year
Vaniam Group
Expiration Date: Until further notice

Requirements:
  • 5+ years of professional experience in data engineering, ETL, or related roles
  • Strong proficiency in Python and SQL for data engineering
  • Hands-on experience building and maintaining pipelines in a lakehouse or modern data platform
  • Practical understanding of Medallion architectures and layered data design
  • Familiarity with modern data stack tools, including Spark or PySpark, workflow orchestration (Airflow, dbt, or similar), testing and observability frameworks, and containers (Docker) with Git-based version control
  • Excellent communication skills, problem-solving mindset, and a collaborative approach
Job Responsibilities:
  • Design, build, and operate reliable ETL and ELT pipelines in Python and SQL
  • Manage ingestion into Bronze, standardization and quality in Silver, and curated serving in Gold layers of our Medallion architecture
  • Maintain ingestion from transactional MySQL systems into Vaniam Core to keep production data flows seamless
  • Implement observability, data quality checks, and lineage tracking to ensure trust in all downstream datasets
  • Develop schemas, tables, and views optimized for analytics, APIs, and product use cases
  • Apply and enforce best practices for security, privacy, compliance, and access control, ensuring data integrity across sensitive healthcare domains
  • Maintain clear and consistent documentation for datasets, pipelines, and operating procedures
  • Lead the integration of third-party datasets, client-provided sources, and new product-generated data into Vaniam Core
  • Partner with product and innovation teams to build repeatable processes for onboarding new data streams
  • Ensure harmonization, normalization, and governance across varied data types (scientific, engagement, operational)
What we offer:
  • 100% remote environment with opportunities for local meet-ups
  • Positive, diverse, and supportive culture
  • Passionate about serving clients focused on Cancer and Blood diseases
  • Investment in you with opportunities for professional growth and personal development through Vaniam Group University
  • Health benefits – medical, dental, vision
  • Generous parental leave benefit
  • Focused on your financial future with a 401(k) Plan and company match
  • Work-Life Balance and Flexibility
  • Flexible Time Off policy for rest and relaxation
  • Volunteer Time Off for community involvement
Employment Type: Full-time

Sr. Data Engineer

We are looking for a Sr. Data Engineer to join our team.
Location: Not provided
Salary: Not provided
Boston Data Pro
Expiration Date: Until further notice

Requirements:
  • Data Engineering: 8 years (Preferred)
  • Data Programming languages: 5 years (Preferred)
  • Data Developers: 5 years (Preferred)
Job Responsibilities:
  • Designs and implements standardized data management procedures around data staging, ingestion, preparation, provisioning, and destruction
  • Ensures the quality of technical solutions as data moves across multiple zones and environments
  • Provides insight into the company's changing data environment and its data processing, storage, and utilization requirements, and offers suggestions for solutions
  • Ensures analytic assets support the company's strategic goals by creating and verifying data acquisition requirements and strategy
  • Develops, constructs, tests, and maintains architectures
  • Aligns architecture with business requirements using programming languages and tools
  • Identifies ways to improve data reliability, efficiency, and quality
  • Conducts research into industry and business questions
  • Deploys sophisticated analytics programs, machine learning, and statistical methods to implement solutions efficiently
  • Prepares data for predictive and prescriptive modeling and finds hidden patterns in the data


Principal Data Engineer

PointClickCare is searching for a Principal Data Engineer who will contribute to...
Location: United States
Salary: 183200.00 - 203500.00 USD / Year
PointClickCare
Expiration Date: Until further notice

Requirements:
  • Principal Data Engineer with at least 10 years of professional experience in software or data engineering, including a minimum of 4 years focused on streaming and real-time data systems
  • Proven experience driving technical direction and mentoring engineers while delivering complex, high-scale solutions as a hands-on contributor
  • Deep expertise in streaming and real-time data technologies, including frameworks such as Apache Kafka, Flink, and Spark Streaming
  • Strong understanding of event-driven architectures and distributed systems, with hands-on experience implementing resilient, low-latency pipelines
  • Practical experience with cloud platforms (AWS, Azure, or GCP) and containerized deployments for data workloads
  • Fluency in data quality practices and CI/CD integration, including schema management, automated testing, and validation frameworks (e.g., dbt, Great Expectations)
  • Operational excellence in observability, with experience implementing metrics, logging, tracing, and alerting for data pipelines using modern tools
  • Solid foundation in data governance and performance optimization, ensuring reliability and scalability across batch and streaming environments
  • Experience with Lakehouse architectures and related technologies, including Databricks, Azure ADLS Gen2, and Apache Hudi
  • Strong collaboration and communication skills, with the ability to influence stakeholders and evangelize modern data practices within your team and across the organization
Job Responsibilities:
  • Lead and guide the design and implementation of scalable streaming data pipelines
  • Engineer and optimize real-time data solutions using frameworks like Apache Kafka, Flink, Spark Streaming
  • Collaborate cross-functionally with product, analytics, and AI teams to ensure data is a strategic asset
  • Advance ongoing modernization efforts, deepening adoption of event-driven architectures and cloud-native technologies
  • Drive adoption of best practices in data governance, observability, and performance tuning for streaming workloads
  • Embed data quality in processing pipelines by defining schema contracts, implementing transformation tests and data assertions, enforcing backward-compatible schema evolution, and automating checks for freshness, completeness, and accuracy across batch and streaming paths before production deployment
  • Establish robust observability for data pipelines by implementing metrics, logging, and distributed tracing for streaming jobs, defining SLAs and SLOs for latency and throughput, and integrating alerting and dashboards to enable proactive monitoring and rapid incident response
  • Foster a culture of quality through peer reviews, providing constructive feedback and seeking input on your own work
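The freshness and completeness checks this posting describes can be sketched in a few lines. This is a toy stand-in, not PointClickCare's actual tooling; the 15-minute lag window and both helper names are invented example values:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(latest_event_time, max_lag=timedelta(minutes=15)):
    """Return True if the newest record landed within the allowed lag window.

    The 15-minute SLO is a hypothetical example value.
    """
    return datetime.now(timezone.utc) - latest_event_time <= max_lag

def check_completeness(expected_keys, observed_keys):
    """Report business keys that never arrived downstream."""
    missing = set(expected_keys) - set(observed_keys)
    return {"complete": not missing, "missing": sorted(missing)}

fresh = check_freshness(datetime.now(timezone.utc) - timedelta(minutes=5))
completeness = check_completeness([1, 2, 3], [1, 3])
print(fresh, completeness)  # True {'complete': False, 'missing': [2]}
```

In a streaming pipeline these assertions would run as automated gates before production deployment, with failures surfaced through the alerting and dashboards described above.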
What we offer:
  • Benefits starting from Day 1!
  • Retirement Plan Matching
  • Flexible Paid Time Off
  • Wellness Support Programs and Resources
  • Parental & Caregiver Leaves
  • Fertility & Adoption Support
  • Continuous Development Support Program
  • Employee Assistance Program
  • Allyship and Inclusion Communities
  • Employee Recognition … and more!
Employment Type: Full-time