Job Description

Responsibilities :


- Design, develop, and implement highly scalable and reliable automation frameworks and test solutions primarily using Python for applications deployed on the AWS cloud platform.


- Lead the testing efforts for AI agents and chatbots, including designing test strategies, developing automated test scripts, and analyzing results to ensure accuracy, performance, and user satisfaction.


- Gain expertise and contribute to the testing of Large Language Models (LLMs), including prompt engineering validation, model output verification, and assessing aspects like coherence, factual accuracy, bias, and toxicity.


- Develop and maintain robust CI/CD pipelines to automate testing, deployment, and monitoring processes within the AWS ecosystem.


- Collaborate closely with development, product, and AI/ML teams to understand system architecture, identify testing gaps, and implement effective automation strategies.


- Perform in-depth analysis of test failures, troubleshoot complex issues, and work with development teams to identify root causes and ensure timely resolution.


- Implement and advocate for best practices in test automation, code quality, and software engineering processes.


- Design and manage automated testing for various components including APIs (RESTful, GraphQL), microservices, and front-end applications.


- Monitor and report on test automation progress and quality metrics, providing actionable insights to stakeholders.


- Stay updated with the latest advancements in test automation, AWS services, AI/ML testing methodologies, and Python libraries, recommending and implementing new tools and techniques to enhance efficiency and coverage.


- Mentor and guide junior automation engineers, fostering a culture of technical excellence and continuous improvement within the team.


Technical Skills :


Programming Languages : Strong proficiency in Python (5+ years hands-on experience), including experience with relevant testing frameworks (e.g., Pytest, Unittest) and libraries.
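
For illustration only, a minimal Pytest-style test of the kind this role involves; the pricing module and order_total function are hypothetical stand-ins for code under test:

```python
# test_order_total.py -- minimal Pytest sketch; pricing.order_total is hypothetical.
import pytest

from pricing import order_total  # hypothetical module under test


@pytest.mark.parametrize(
    "quantity, unit_price, expected",
    [(1, 10.0, 10.0), (3, 2.5, 7.5), (0, 99.0, 0.0)],
)
def test_order_total(quantity, unit_price, expected):
    # Verify the computed total for several quantity/price combinations.
    assert order_total(quantity, unit_price) == pytest.approx(expected)


def test_order_total_rejects_negative_quantity():
    # Invalid input should raise rather than silently return a wrong value.
    with pytest.raises(ValueError):
        order_total(-1, 10.0)
```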


Cloud Platform : Hands-on experience with AWS services (e.g., EC2, Lambda, S3, SQS, SNS, DynamoDB, CloudWatch, API Gateway, Step Functions, SageMaker).
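
As a sketch of the kind of AWS-facing check involved, the test below uses boto3 (the standard AWS SDK for Python) to invoke a Lambda and confirm its output reached S3; the bucket, prefix, and function names are placeholders:

```python
# test_report_pipeline.py -- illustrative AWS check with boto3; names are placeholders.
import json

import boto3

BUCKET = "example-reports-bucket"       # placeholder
LAMBDA_NAME = "example-report-builder"  # placeholder


def test_report_lambda_writes_to_s3():
    # Invoke the Lambda synchronously and confirm the call succeeded.
    lambda_client = boto3.client("lambda")
    response = lambda_client.invoke(
        FunctionName=LAMBDA_NAME,
        Payload=json.dumps({"report_date": "2024-01-01"}),
    )
    assert response["StatusCode"] == 200

    # Confirm at least one report object exists under the expected prefix.
    s3 = boto3.client("s3")
    listing = s3.list_objects_v2(Bucket=BUCKET, Prefix="reports/2024-01-01/")
    assert listing.get("KeyCount", 0) > 0
```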


Automation Frameworks : Demonstrated experience in designing, developing, and maintaining scalable test automation frameworks from scratch using Python.


Testing AI Agents & Chatbots :


- Experience in testing conversational AI systems, understanding intent recognition, entity extraction, dialogue flow, and contextual understanding.


- Familiarity with tools and techniques for evaluating chatbot performance and user experience.


Large Language Models (LLMs) Testing (Preferred) :


- Understanding of LLM evaluation metrics (e.g., perplexity, coherence, factual accuracy, toxicity, bias).


- Experience with techniques for prompt engineering validation and model output verification.


- Familiarity with frameworks or approaches for testing LLM-powered applications.
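
As a rough sketch of the output verification described above: the test asserts basic properties of a model response, where generate_response is a hypothetical wrapper around the LLM under test, and the simple keyword and length checks stand in for fuller coherence, accuracy, bias, and toxicity evaluation:

```python
# test_llm_outputs.py -- illustrative output checks; assistant.generate_response is hypothetical.
import pytest

from assistant import generate_response  # hypothetical LLM wrapper under test

PROMPT_CASES = [
    # (prompt, keywords the answer must mention)
    ("How do I reset my password?", ["reset", "password"]),
    ("What are your support hours?", ["hours"]),
]


@pytest.mark.parametrize("prompt, required_keywords", PROMPT_CASES)
def test_response_mentions_required_keywords(prompt, required_keywords):
    answer = generate_response(prompt)

    # Basic sanity: the model returned a string of reasonable length.
    assert isinstance(answer, str) and 0 < len(answer) < 2000

    # Keyword presence is a crude proxy for relevance; a production suite
    # would add semantic similarity, bias, and toxicity scoring.
    lowered = answer.lower()
    for keyword in required_keywords:
        assert keyword in lowered, f"Expected '{keyword}' in: {answer!r}"
```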


CI/CD Tools : Proficient with CI/CD tools like Jenkins, GitLab CI/CD, AWS CodePipeline/CodeBuild/CodeDeploy for orchestrating automated tests and deployments.


Version Control : Expert-level proficiency with Git (GitHub, GitLab, Bitbucket).


API Testing : Extensive experience with API testing tools and frameworks (e.g., Postman, Pytest with Requests, REST Assured).
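
A small illustration of Pytest driving the Requests library against a REST endpoint; the base URL and response fields are placeholders:

```python
# test_users_api.py -- illustrative REST API test; URL and payload shape are placeholders.
import requests

BASE_URL = "https://api.example.com"  # placeholder


def test_get_user_returns_expected_fields():
    # A healthy endpoint should respond promptly with well-formed JSON.
    response = requests.get(f"{BASE_URL}/v1/users/42", timeout=5)

    assert response.status_code == 200
    assert response.headers["Content-Type"].startswith("application/json")

    body = response.json()
    for field in ("id", "email", "created_at"):
        assert field in body
```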


Containerization : Working knowledge of Docker for creating consistent test environments.


Database Testing : Experience with testing relational and NoSQL databases.


Operating Systems : Proficiency in Linux/Unix environments.


Agile Methodologies : Experience working in an Agile/Scrum development environment.

