Posted on: 18/08/2025
Key Responsibilities:
- Validate and test Big Data ecosystems including Hadoop, Kafka, Solr, and related ETL components.
- Design and execute ETL test cases, validating data extraction, transformation logic, and data loading processes (see the sketch after this list).
- Perform end-to-end debugging across multi-node architectures to identify and resolve issues.
- Write and maintain functional and non-functional test cases and automation scripts using Python.
- Analyze and troubleshoot Linux OS environments with a focus on filesystem, network stack, and application-level issues.
- Work with Layer 2/3 networking protocols and validate end-to-end data flow.
- Prepare and execute User Acceptance Testing (UAT) plans and certification activities.
- Provide deployment support and assist in creating proof-of-concept (POC) setups.
- Collaborate with product, PS (Professional Services), and development teams to groom requirements and optimize testing workflows.
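To illustrate the ETL test-case responsibility above, here is a minimal Python sketch of the kind of extraction/transformation/load checks such a role involves. The tables, columns, and the uppercase-transform rule are hypothetical stand-ins, and SQLite is used only so the example is self-contained; a real suite would target the actual pipeline's source and target stores.

# Minimal sketch of an ETL validation test case. All table names,
# columns, and the uppercase-transform rule are hypothetical.
import sqlite3

def build_fixture(conn: sqlite3.Connection) -> None:
    """Create a toy source table and a 'loaded' target table."""
    conn.executescript("""
        CREATE TABLE src (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE tgt (id INTEGER PRIMARY KEY, name TEXT);
        INSERT INTO src VALUES (1, 'alice'), (2, 'bob');
        -- Simulate the transform step: names are uppercased on load.
        INSERT INTO tgt SELECT id, UPPER(name) FROM src;
    """)

def test_row_counts_match(conn: sqlite3.Connection) -> None:
    """Extraction/load check: no rows dropped or duplicated."""
    src = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
    assert src == tgt, f"row count mismatch: src={src} tgt={tgt}"

def test_transform_rule(conn: sqlite3.Connection) -> None:
    """Transformation check: each target name is the uppercased source name."""
    rows = conn.execute("""
        SELECT s.id FROM src s JOIN tgt t ON s.id = t.id
        WHERE t.name != UPPER(s.name)
    """).fetchall()
    assert not rows, f"transform rule violated for ids: {rows}"

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    build_fixture(conn)
    test_row_counts_match(conn)
    test_transform_rule(conn)
    print("ETL validation checks passed")

The same three checks (row counts, transform correctness, and referential integrity via the join) carry over directly to Hive, HDFS, or RDBMS targets.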
Required Skills & Qualifications:
- Strong knowledge of Big Data technologies: Hadoop, Kafka, Solr, etc. (a Kafka smoke-test sketch follows this list).
- Proficiency in Python for test automation and scripting.
- Solid understanding of ETL concepts, including pipeline debugging and validation.
- Hands-on expertise in Linux for debugging filesystem, process, and network issues.
- Basic understanding of networking protocols, especially Layer 2/3.
- Experience working with any RDBMS or NoSQL databases.
- Familiarity with SDLC and STLC, including documentation and reporting practices.
- Strong problem-solving skills and attention to detail.
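As a concrete example of the Kafka validation skill listed above, the following is a hedged sketch of a produce/consume round-trip smoke test. It assumes the kafka-python package is installed and a broker is reachable at localhost:9092; the topic name is a hypothetical placeholder.

# Minimal Kafka produce/consume smoke test (sketch, not a full suite).
# Assumes kafka-python and a broker at localhost:9092; topic is hypothetical.
import uuid
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"
TOPIC = "qa-smoke-test"  # hypothetical test topic

def kafka_round_trip() -> None:
    """Produce a unique message and assert it can be consumed back."""
    payload = uuid.uuid4().hex.encode()

    producer = KafkaProducer(bootstrap_servers=BROKER)
    producer.send(TOPIC, payload)
    producer.flush()  # block until the broker acknowledges the write
    producer.close()

    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        auto_offset_reset="earliest",
        consumer_timeout_ms=10_000,  # give up after 10s instead of hanging
    )
    seen = {record.value for record in consumer}
    consumer.close()
    assert payload in seen, "produced message never came back from the broker"

if __name__ == "__main__":
    kafka_round_trip()
    print("Kafka round-trip smoke test passed")

The unique UUID payload lets the check run repeatedly against a shared test topic without false positives from earlier runs.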
Good to Have:
- Experience with automation testing frameworks and tools.
- Hands-on with API testing using Postman, SoapUI, REST Assured, etc. (see the sketch after this list).
- Exposure to CI/CD pipelines, Jenkins, or other deployment tools.
- Prior experience in security, telecom, or communication analytics domains is a plus.
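For the API-testing item above, here is a minimal sketch in Python's requests library (a scripted alternative to Postman or SoapUI). The base URL, endpoint, and expected response fields are hypothetical placeholders for whatever contract the service under test defines.

# Hedged sketch of a basic API contract test using the requests library.
# The endpoint and expected fields are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def test_get_user_returns_expected_shape() -> None:
    """Status-code, content-type, and response-schema checks for one endpoint."""
    resp = requests.get(f"{BASE_URL}/users/1", timeout=10)
    assert resp.status_code == 200, f"unexpected status: {resp.status_code}"
    assert resp.headers.get("Content-Type", "").startswith("application/json")
    body = resp.json()
    for field in ("id", "name", "email"):  # hypothetical contract fields
        assert field in body, f"missing field: {field}"

if __name__ == "__main__":
    test_get_user_returns_expected_shape()
    print("API contract check passed")

Tests of this shape drop straight into a pytest run inside a Jenkins or other CI/CD pipeline, which ties together the automation, API-testing, and CI/CD items above.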
Posted in: Quality Assurance
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1531613