Data Quality Engineer Tester

NTT DATA

Location: India, Remote

Contract Type: Not provided

Salary: Not provided

Job Description:

The Data Quality Engineer Tester role involves validating data sources and ensuring data integrity through rigorous testing. The ideal candidate will have strong proficiency in SQL and Python, with experience in ETL testing and automated testing frameworks. Responsibilities include designing test cases, managing defects, and conducting performance testing. A minimum of 2 years of experience is required, with a focus on data quality assurance and data governance.

Job Responsibilities:

  • Data Validation (ETL Testing): Validating data sources, extraction, transformation logic, and loading (ETL) to ensure data integrity
  • Pipeline Testing: Designing and executing test cases for data pipelines (e.g., in AWS Glue, Azure Data Factory, or Apache Airflow) to ensure smooth, secure, and accurate data flow
  • Test Automation: Developing automated test scripts using Python, PySpark, or SQL to validate large datasets and reduce manual testing effort (a sketch of such a script follows this list)
  • Data Quality Rules & Checks: Creating and enforcing checks based on data completeness, transformation logic, and data mapping documents
  • Defect Management: Identifying, documenting, and tracking data-related defects, and working with engineers to troubleshoot root causes
  • Performance Testing: Testing for scalability and performance bottlenecks to ensure data systems handle high-volume data
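
To make the test-automation responsibility above concrete, here is a minimal sketch of an automated validation script in PySpark. It is illustrative only: the storage paths, table layout, column names, and thresholds are assumptions for the example, not details taken from this posting.

```python
# Minimal ETL validation sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_validation_sketch").getOrCreate()

# Source extract and loaded target for the pipeline under test (assumed locations).
source_df = spark.read.parquet("s3://example-bucket/raw/orders/")
target_df = spark.read.parquet("s3://example-bucket/curated/orders/")

failures = []

# Completeness: row counts should match after the load.
src_count, tgt_count = source_df.count(), target_df.count()
if src_count != tgt_count:
    failures.append(f"Row count mismatch: source={src_count}, target={tgt_count}")

# Validity: key columns must not contain nulls.
for column in ["order_id", "customer_id"]:
    null_count = target_df.filter(F.col(column).isNull()).count()
    if null_count > 0:
        failures.append(f"{null_count} null value(s) in {column}")

# Uniqueness: the business key must not be duplicated.
dup_count = target_df.groupBy("order_id").count().filter(F.col("count") > 1).count()
if dup_count > 0:
    failures.append(f"{dup_count} duplicated order_id value(s)")

# Fail loudly so an orchestrator (e.g., Airflow) or CI job surfaces the defect.
if failures:
    raise AssertionError("Data validation failed:\n" + "\n".join(failures))
print("All data validation checks passed.")
```

In practice, checks like these would usually be packaged into a framework such as pytest or Great Expectations and scheduled by the pipeline orchestrator, but the underlying pass/fail logic stays the same.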

Requirements:

  • SQL Mastery (2-4 years): Strong proficiency in writing complex SQL queries for backend data validation, including joins, subqueries, and aggregations (see the example query after this list)
  • Scripting/Programming (2-4 years): Proficiency in a language such as Python for data manipulation, automation, and testing
  • ETL & Database Knowledge (2-4 years): Understanding of data warehousing, data modeling (star/snowflake schemas), and ETL tools (e.g., Informatica, Talend)
  • Big Data Frameworks (2-4 years): Familiarity with distributed computing systems such as Apache Spark or Hadoop
  • Cloud Platforms (2-4 years): Familiarity with cloud data services such as AWS S3 and Redshift
  • Testing Tools (2-4 years): Experience with testing tools like Great Expectations, dbt, or automated testing frameworks
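
As a concrete illustration of the SQL-validation requirement above, the self-contained sketch below runs a reconciliation query (aggregation plus a left join between a staging table and a warehouse table) against an in-memory SQLite database. The table names, columns, and sample rows are invented for the example; the actual systems and schemas are not specified in this posting.

```python
# Self-contained SQL reconciliation sketch using in-memory SQLite as a stand-in
# for the real source and target systems (hypothetical tables and data).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE stg_orders (order_id INTEGER, region TEXT, amount REAL);
    CREATE TABLE dwh_orders (order_id INTEGER, region TEXT, amount REAL);
    INSERT INTO stg_orders VALUES (1, 'EU', 10.0), (2, 'EU', 20.0), (3, 'US', 5.0);
    INSERT INTO dwh_orders VALUES (1, 'EU', 10.0), (2, 'EU', 20.0);  -- order 3 not loaded
    """
)

# Totals per region in the source must match the target; COALESCE handles
# regions that are missing from the target entirely.
reconciliation_sql = """
SELECT s.region,
       s.src_total,
       COALESCE(t.tgt_total, 0) AS tgt_total
FROM (SELECT region, SUM(amount) AS src_total FROM stg_orders GROUP BY region) AS s
LEFT JOIN (SELECT region, SUM(amount) AS tgt_total FROM dwh_orders GROUP BY region) AS t
  ON s.region = t.region
WHERE COALESCE(t.tgt_total, 0) <> s.src_total;
"""

mismatches = conn.execute(reconciliation_sql).fetchall()
for region, src_total, tgt_total in mismatches:
    print(f"Mismatch in {region}: source={src_total}, target={tgt_total}")

# The deliberately missing row above makes this check fail, which is the point:
# a mismatch should be raised and tracked as a defect, not silently ignored.
assert not mismatches, f"{len(mismatches)} region(s) failed reconciliation"
```

The same join-and-aggregate pattern carries over to warehouse engines such as Redshift; only the connection layer changes.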

Additional Information:

Job Posted: February 14, 2026
Employment Type: Full-time
Work Type: Remote work

Similar Jobs for Data Quality Engineer Tester

Sustenance Engineer - Actian Data Platform

This position is responsible for sustenance of Avalanche Cloud Datawarehouse. Th...

Location: India, Bangalore; Pune
Salary: Not provided
Actian
Expiration Date: Until further notice

Requirements:
  • Prior experience in building scalable Cloud applications on AWS/Azure/GCP
  • At least 5 years’ experience with Node.js, Scala or Java programming environments
  • Containerization experience with Kubernetes
  • Advanced design/debugging/coding skills for distributed systems, microservices architecture and REST APIs
  • Complex problem-solving skills
  • Good written and oral communication skills
  • Organizational skills
  • Analytical skills
  • Technical skills
  • Quality management
Job Responsibilities:
  • Design and implement complex units/modules/products that meet functional and business requirements
  • Develop plans outlining steps for developing features and communicate status
  • Write and maintain documentation for program development, logic, coding, testing, changes, and corrections
  • Perform unit/module testing of software to find errors and confirm specifications
  • Fix bugs and add enhancements
  • Participate in design and code reviews
  • Assist the Quality Assurance team by supporting testers and support personnel
  • Review and approve software testing plans for quality assurance
  • Provide input to establish and improve departmental processes and procedures
  • Provide product content to Technical Writers

Employment Type: Full-time

Senior Data Manual Tester

The Senior Data/Manual Tester plays a key role in establishing and maintaining d...

Location: United Kingdom, London
Salary: Not provided
Solirius Consulting
Expiration Date: Until further notice

Requirements:
  • Strong understanding of data quality principles (accuracy, completeness, consistency, timeliness, validity)
  • Proven experience in data testing / data quality assurance, including manual validation techniques
  • Experience managing data quality issues, defect logs, and remediation workflows
  • Ability to define data quality rules and validation requirements for automated controls
  • Working knowledge of data platforms, data warehouses, and data pipelines
  • Ability to collaborate effectively with data engineering and analytics teams
  • Experience analysing data quality results and producing clear reporting and insights
  • Strong stakeholder engagement, communication, and prioritisation skills
  • Demonstrate a strong understanding of Agile methodologies
Job Responsibilities:
  • Define and maintain data quality frameworks, standards, and validation rules
  • Manage data quality issue logs, including prioritisation, tracking, and resolution
  • Design and coordinate data quality validation checks (manual and automated)
  • Partner with Data Analytics Engineers to define requirements for automated data quality controls
  • Review data quality outcomes, analyse trends, and report issues and risks
  • Collaborate with business and data teams to resolve data quality issues and drive improvements
  • Analysing test requirements, designing and producing reusable test scripts
  • Creating and managing test regression packs
  • Overcoming obstacles, such as working with the team on clarification of requirements or seeing bugs through to resolution to deliver quality testing on time
  • Analysing test results and updating test suites as needed to ensure accuracy and coverage
What we offer:
  • Competitive Salary
  • 25 Days Annual Leave + Bank Holidays
  • Flexibility to work from home
  • 10 days allocated for development training per year
  • Generous Discretionary Bonus
  • Statutory & Contributory Pension
  • Private Healthcare Cover
  • Discounted Gym Membership
  • Enhanced Parental Leave
  • Paid Fertility Leave

Employment Type: Full-time

Senior Engineer, Equipment Engineering

Sandisk understands how people and businesses consume data and we relentlessly i...

Location: Malaysia, Batu Kawan
Salary: Not provided
Sandisk
Expiration Date: Until further notice

Requirements:
  • Bachelor of Engineering (Hons) in Electrical/Electronics/Mechanical/Mechatronic
  • 3+ years of experience in test equipment engineering, specifically in SSD manufacturing or a related industry, with a strong emphasis on sustaining and maintenance
  • Proven track record of maintaining high slot-on availability and first pass yield in a high-volume production environment
  • Experience in utilizing data analytics for equipment monitoring, troubleshooting, and maintenance optimization
  • Expertise in maintaining and troubleshooting SSD test equipment, with a deep understanding of calibration, reliability, and maintenance protocols
  • Proficiency in data analysis tools (e.g., Python, R, SQL, MATLAB) for monitoring and optimizing equipment performance
  • Strong knowledge of predictive maintenance techniques, including AI-driven approaches to equipment reliability
  • Familiarity with SPC, DOE, and Six Sigma methodologies for continuous improvement
  • Strong leadership skills with the ability to mentor and guide junior engineers and technicians
  • Excellent problem-solving and decision-making abilities, with a focus on data-driven solutions
Job Responsibilities:
  • Ensure that test equipment is functioning optimally, achieving >99.3% slot-on availability and maintaining a high first pass yield
  • Perform routine maintenance, calibration, and troubleshooting of test systems to minimize downtime and ensure high reliability
  • Monitor real-time and historical data to detect issues, reduce failures, and maximize equipment uptime
  • Use data analytics to monitor test equipment performance, identify patterns, and anticipate potential issues before they impact production
  • Analyze test equipment data to improve test efficiency, reduce down times, and enhance tester performance
  • Implement real-time monitoring and diagnostic tools to ensure sustained optimal performance of test equipment
  • Analyze equipment performance data to optimize maintenance schedules and extend the life cycle of test equipment
  • Lead continuous improvement projects focused on sustaining and enhancing the efficiency and reliability of test equipment
  • Lead and mentor a team of engineers and technicians in sustaining the testers, sharing best practices for maintenance and troubleshooting
  • Work closely with production, quality, and other cross-functional teams to ensure seamless operation and continuous improvement in testing

Employment Type: Full-time

Data Architect

We are seeking a talented and experienced Data Architect to join our team. The D...

Location: India, Gurugram
Salary: Not provided
Circle K
Expiration Date: Until further notice

Requirements:
  • Full-Time bachelor’s or master’s degree in engineering/technology, computer science, information technology, or related fields
  • 10+ years of total experience in data modeling and database design
  • Experience in the Retail domain is an added advantage
  • 8+ years of experience in data engineering development and support
  • 3+ years of experience in leading technical team of data engineers and BI engineers
  • Proficiency in data modeling tools such as Erwin, ER/Studio, or similar tools
  • Strong knowledge of Azure cloud infrastructure and development with SQL/Python/PySpark on ADF, Synapse, and Databricks
  • Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure functions
  • Strong communication, interpersonal, collaboration skills along with leadership capabilities
  • Ability to work effectively in a fast-paced, dynamic environment as cloud SME
Job Responsibilities:
  • Collaborate with solution architect, data engineers, business stakeholders, business analysts, and DQ testers to ensure data management and data governance framework is defined as critical components
  • Design and develop data models using industry-standard modeling techniques and tools
  • Perform data profiling, data lineage and analysis to understand data quality, structure, and relationships
  • Optimize data models for performance, scalability, and usability by creating optimal data storage layer
  • Define and enforce data modeling standards, best practices, and guidelines
  • Participate in data governance initiatives to ensure compliance with data management policies and standards
  • Work closely with database administrators and developers to implement data models in relational and non-relational database systems
  • Conduct data model reviews and provide recommendations for improvements
  • Stay updated on emerging trends and technologies in data modeling and data management
  • Conduct continuous audits of data management system performance and refine where necessary

Employment Type: Full-time

Data Architect

We are seeking a talented and experienced Data Architect/ Modeller to join our t...

Location: India, Gurugram
Salary: Not provided
Circle K
Expiration Date: Until further notice

Requirements:
  • Full-Time bachelor’s or master’s degree in engineering/technology, computer science, information technology, or related fields
  • 10+ years of total experience in data modeling and database design; experience in the Retail domain is an added advantage
  • 8+ years of experience in data engineering development and support
  • 3+ years of experience in leading technical team of data engineers and BI engineers
  • Proficiency in data modeling tools such as Erwin, ER/Studio, or similar tools
  • Strong knowledge of Azure cloud infrastructure and development with SQL/Python/PySpark on ADF, Synapse, and Databricks
  • Hands-on experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Python/PySpark, Logic Apps, Key Vault, and Azure functions
  • Strong communication, interpersonal, collaboration skills along with leadership capabilities
  • Ability to work effectively in a fast-paced, dynamic environment as cloud SME
  • Act as the single point of contact for all data management-related queries to make data decisions
Job Responsibilities:
  • Collaborate with solution architect, data engineers, business stakeholders, business analysts, and DQ testers to ensure data management and data governance framework is defined as critical components
  • Design and develop data models using industry-standard modeling techniques and tools
  • Perform data profiling, data lineage and analysis to understand data quality, structure, and relationships
  • Optimize data models for performance, scalability, and usability by creating optimal data storage layer
  • Define and enforce data modeling standards, best practices, and guidelines
  • Participate in data governance initiatives to ensure compliance with data management policies and standards
  • Work closely with database administrators and developers to implement data models in relational and non-relational database systems
  • Conduct data model reviews and provide recommendations for improvements
  • Stay updated on emerging trends and technologies in data modeling and data management

Employment Type: Full-time

Senior Quality Analyst

The role focuses on quality engineering and automation strategy for the Global F...

Location: India, Pune
Salary: Not provided
Citi
Expiration Date: Until further notice

Requirements:
  • 10+ years of experience in functional and non-functional software testing
  • 3+ years of experience as Test Automation Lead
  • Expertise in test automation frameworks/tools like Jenkins, Selenium, Cucumber, TestNG, JUnit, Cypress
  • Strong programming skills in Java, Python or any other programming or scripting language
  • Expertise in SQL
  • Experience with API testing tools (Postman, RestAssured) and performance testing tools (JMeter, LoadRunner)
  • Expertise in build tools like Maven / Gradle, continuous integration tools like Jenkins, source management tools like Git/GitHub
  • Strong knowledge of Agile, Scrum, and DevOps practices
  • Strong knowledge of functional test management tools such as JIRA
  • Familiarity with cloud-based test execution – AWS, Azure, or GCP
Job Responsibilities:
  • Plan, lead, and execute testing automation strategy for CitiFTP
  • Continuously monitor and increase automation coverage by enhancing the existing automation framework
  • Design, Develop, and Implement scalable and maintainable automation frameworks for UI, API, and data validation testing on Big Data/Hadoop platform
  • Collaborate with other testing areas, development teams, product owners, and business partners to integrate automation into the agile SDLC
  • Enhance the efficiency of regression and end-to-end testing using automation
  • Develop robust test scripts and maintain automation suites to support rapid software releases
  • Improve overall test coverage, defect detection, and release quality through automation
  • Establish and track key QA metrics, e.g., defect leakage, test execution efficiency, and automation coverage
  • Advocate for best practices in test automation, including code reviews, reusability, and maintainability
  • Drive the adoption of AI/ML-based testing tools and emerging trends in test automation
What we offer:
  • Global workforce benefits designed to support well-being, growth, and work-life balance

Employment Type: Full-time

Azure Data Engineer

The Azure Data Engineer role involves designing and maintaining ETL pipelines us...

Location: India, Chennai
Salary: Not provided
NTT DATA
Expiration Date: Until further notice

Requirements:
  • 5–8+ years of experience as a Data Engineer with strong hands‑on expertise in Azure (Data Factory, Databricks, Data Lake Storage, SQL, Synapse preferred)
  • Proven ability to build production‑grade ETL/ELT pipelines supporting complex, multi‑regional business processes
  • Experience designing or implementing rules engines (Drools, ODM, or similar)
  • Strong SQL skills and experience with data modeling, data orchestration, and pipeline optimization
  • Experience working in Agile Scrum teams and collaborating across global regions (U.S. and India preferred)
  • Ability to partner closely with analysts and business stakeholders to translate rules into technical solutions
  • Excellent debugging, optimization, and engineering problem‑solving skills
  • Minimum Skills Required: SQL, Python, Azure Data Factory, Databricks, Azure Synapse
Job Responsibilities:
  • Design, build, and maintain Azure‑based ETL pipelines (e.g., Data Factory, Databricks, Data Lake) to ingest, clean, transform, and aggregate compensation‑related datasets across multiple regions
  • Engineer upstream processes to produce 9–10 monthly aggregated output files (customer, revenue, product, sales rep, etc.), delivered 3× per month
  • Ensure repeatability, monitoring, orchestration, and error‑handling for all ingestion and transformation workflows
  • Contribute to the creation of a master stitched data file to replace Varicent’s current data‑assembly functions
  • Build, configure, and maintain a rules engine (ODM, Drools, or similar) to externalize business logic previously embedded in code
  • Translate rules and logic captured by analysts and business SMEs into scalable, testable engine components
  • Implement versioning, governance, and validation mechanisms for all logic used in compensation calculations
  • Ensure rule changes can be managed safely, reducing risk in high‑stakes compensation scenarios
  • Partner with data architects to implement the target‑state Azure data architecture for compensation analytics
  • Develop optimized, scalable physical data models aligned to business logic and downstream needs

Azure Data Engineer

The Azure Data Engineer role requires 5-8 years of experience in designing and m...

Location: India, Chennai
Salary: Not provided
NTT DATA
Expiration Date: Until further notice

Requirements:
  • 5–8+ years of experience as a Data Engineer with strong hands‑on expertise in Azure (Data Factory, Databricks, Data Lake Storage, SQL, Synapse preferred)
  • Proven ability to build production‑grade ETL/ELT pipelines supporting complex, multi‑regional business processes
  • Experience designing or implementing rules engines (Drools, ODM, or similar)
  • Strong SQL skills and experience with data modeling, data orchestration, and pipeline optimization
  • Experience working in Agile Scrum teams and collaborating across global regions (U.S. and India preferred)
  • Ability to partner closely with analysts and business stakeholders to translate rules into technical solutions
  • Excellent debugging, optimization, and engineering problem‑solving skills
  • Minimum Skills Required: SQL, Python, Azure Data Factory, Databricks, Azure Synapse
Job Responsibilities:
  • Design, build, and maintain Azure‑based ETL pipelines (e.g., Data Factory, Databricks, Data Lake) to ingest, clean, transform, and aggregate compensation‑related datasets across multiple regions
  • Engineer upstream processes to produce 9–10 monthly aggregated output files (customer, revenue, product, sales rep, etc.), delivered 3× per month
  • Ensure repeatability, monitoring, orchestration, and error‑handling for all ingestion and transformation workflows
  • Contribute to the creation of a master stitched data file to replace Varicent’s current data‑assembly functions
  • Build, configure, and maintain a rules engine (ODM, Drools, or similar) to externalize business logic previously embedded in code
  • Translate rules and logic captured by analysts and business SMEs into scalable, testable engine components
  • Implement versioning, governance, and validation mechanisms for all logic used in compensation calculations
  • Ensure rule changes can be managed safely, reducing risk in high‑stakes compensation scenarios
  • Partner with data architects to implement the target‑state Azure data architecture for compensation analytics
  • Develop optimized, scalable physical data models aligned to business logic and downstream needs

Employment Type: Full-time