Job Description
We are seeking a QA Manual Tester with a strong background in Data Engineering and Analytics environments to ensure the accuracy, integrity, and quality of data across complex pipelines, warehouses, and reporting platforms. This role is crucial for validating data ingestion, transformation logic, and end-to-end data flows across platforms such as Snowflake, StreamSets, AWS, and BI tools. The ideal candidate will possess strong SQL skills, analytical thinking, and hands-on testing experience in modern data architectures.
Responsibilities
Perform end-to-end testing of data pipelines, including validation of data ingestion, transformation, and loading stages.
Review data mappings and transformation logic against source-to-target mapping documents.
Collaborate with data engineers to understand technical designs and ensure testability of data workflows.
Prepare comprehensive test plans, test scenarios, and test cases for structured and semi-structured data processing.
Perform data validation using complex SQL queries across staging, raw, and curated zones (illustrative checks are sketched after this list).
Identify, document, and track defects related to data discrepancies, job failures, and performance issues.
Validate data accuracy, completeness, duplicate detection, null handling, and boundary conditions.
Perform reconciliation testing between source systems (e.g., RDBMS, APIs, Flat Files) and the target platform (e.g., Snowflake, Redshift); an example reconciliation query follows this list.
Use tools such as Excel, SQL clients, Postman, or REST clients for test execution and result analysis.
Work closely with data engineers, business analysts, and DevOps teams to support pipeline testing and releases.
Participate actively in Agile ceremonies – sprint planning, daily standups, retrospectives, and backlog grooming.
Work with cross-functional teams including Data Engineers, Platform Admins, DevOps, and QA to ensure successful sprint outcomes.
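
For the data-quality checks listed above (null handling and duplicate detection), a minimal SQL sketch is shown below. The schema curated, table customer_curated, and key customer_id are hypothetical placeholders for illustration, not systems named in this posting.

-- Null handling: count rows where the business key is missing.
SELECT COUNT(*) AS null_key_rows
FROM curated.customer_curated
WHERE customer_id IS NULL;

-- Duplicate detection: surface business keys loaded more than once.
SELECT customer_id, COUNT(*) AS occurrences
FROM curated.customer_curated
GROUP BY customer_id
HAVING COUNT(*) > 1;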
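
Reconciliation testing between a source extract and the target platform typically starts with comparing row counts and aggregate totals. The sketch below assumes hypothetical staging.orders_extract and curated.orders tables with an order_amount column; a real reconciliation suite would extend this to column-level and record-level comparisons.

-- Source-to-target reconciliation: compare row counts and an amount total.
SELECT 'source' AS side, COUNT(*) AS row_count, SUM(order_amount) AS amount_total
FROM staging.orders_extract
UNION ALL
SELECT 'target' AS side, COUNT(*) AS row_count, SUM(order_amount) AS amount_total
FROM curated.orders;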