Posted on: 28/01/2026
Description:
We are seeking a DevOps Engineer with strong experience in cloud infrastructure, automation, and CI/CD, along with hands-on exposure to React frontend applications and Python/PySpark-based data or backend workloads. The ideal candidate will support end-to-end application and data pipeline deployments in a scalable and reliable environment.
Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines for React, Python, and PySpark applications
- Deploy and manage React frontend applications and Python/PySpark backend or data processing jobs
- Automate infrastructure provisioning using Infrastructure as Code (IaC) tools
- Manage and optimize cloud infrastructure (AWS / Azure / GCP)
- Build, deploy, and manage containerized applications using Docker and Kubernetes
- Support data pipelines and batch/stream processing using PySpark (see the sketch after this list)
- Ensure system reliability, scalability, performance, and security
- Implement monitoring, logging, and alerting solutions
- Troubleshoot deployment and production issues across environments
- Collaborate with development and data engineering teams
- Apply DevSecOps best practices across pipelines and infrastructure
- Document architecture, deployment processes, and operational runbooks
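
As a minimal sketch of the PySpark workloads referenced above, the batch job below aggregates daily order totals and writes them back to object storage. The application name, input/output paths, and column names are illustrative assumptions, not details taken from this posting.

    # Illustrative PySpark batch job; paths and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    def main():
        spark = (
            SparkSession.builder
            .appName("daily-order-aggregation")  # assumed job name
            .getOrCreate()
        )

        # Read raw events (hypothetical input location and schema).
        orders = spark.read.parquet("s3://example-bucket/raw/orders/")

        # Aggregate order totals per customer per day.
        daily_totals = (
            orders
            .withColumn("order_date", F.to_date("order_timestamp"))
            .groupBy("customer_id", "order_date")
            .agg(F.sum("amount").alias("daily_total"))
        )

        # Write results for downstream consumers.
        daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")

        spark.stop()

    if __name__ == "__main__":
        main()

A job like this would typically be packaged and submitted by the CI/CD pipeline (for example via spark-submit on EMR, Databricks, or Spark on Kubernetes), which is the deployment work this role owns.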
Required Skills & Qualifications:
- Bachelor's or advanced degree in a related technical field
- 5 to 8+ years of experience in DevOps / Infrastructure Engineering
- Strong experience with cloud platforms (AWS preferred; Azure/GCP acceptable)
- Hands-on experience with CI/CD tools (Jenkins, GitHub Actions, GitLab CI, Azure DevOps)
- Experience with Docker and Kubernetes
- Proficiency in Python for automation and backend support
- Hands-on experience with PySpark for data processing or analytics workloads
- Working knowledge of React application build and deployment processes
- Experience with Infrastructure as Code (Terraform, CloudFormation, ARM)
- Strong scripting skills (Bash, Python); see the example after this list
- Familiarity with monitoring/logging tools (Prometheus, Grafana, ELK, Datadog, etc.)
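
As a small scripting example for the item above, the following Python health check probes a set of endpoints and exits non-zero on failure, the kind of check a deployment stage or alerting cron job might run. The endpoint URLs are hypothetical.

    # Illustrative deployment health check; endpoint URLs are hypothetical.
    import sys
    import urllib.error
    import urllib.request

    ENDPOINTS = [
        "https://app.example.com/healthz",
        "https://api.example.com/healthz",
    ]

    def check(url: str, timeout: float = 5.0) -> bool:
        """Return True when the endpoint answers with HTTP 200."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.status == 200
        except (urllib.error.URLError, OSError):
            return False

    def main() -> int:
        failures = [url for url in ENDPOINTS if not check(url)]
        for url in failures:
            print(f"UNHEALTHY: {url}")
        # Non-zero exit lets a CI/CD stage or a monitoring job flag the failure.
        return 1 if failures else 0

    if __name__ == "__main__":
        sys.exit(main())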
Preferred Qualifications:
- Experience with big data platforms (EMR, Databricks, Spark on Kubernetes)
- Knowledge of microservices architecture
- Experience with DevSecOps and security scanning tools
- Familiarity with data orchestration tools (Airflow, Step Functions); see the sketch after this list
- Cloud certifications (AWS / Azure / GCP)
- Experience working in Agile/Scrum environments
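
As a hedged sketch of the orchestration tooling mentioned above, the following Airflow 2.x DAG schedules a nightly spark-submit run. The DAG id, schedule, and job path are assumptions made for illustration.

    # Minimal Airflow 2.x DAG sketch; dag_id, schedule, and paths are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="nightly_pyspark_batch",   # assumed name
        start_date=datetime(2026, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submit the PySpark job from the earlier sketch (path is illustrative).
        submit_job = BashOperator(
            task_id="spark_submit_daily_aggregation",
            bash_command="spark-submit --deploy-mode cluster /opt/jobs/daily_order_aggregation.py",
        )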
Soft Skills:
- Strong communication and collaboration skills
- Ability to work across DevOps, frontend, backend, and data teams
- Ownership mindset and proactive problem-solving attitude
Posted in: DevOps / SRE
Functional Area: DevOps / Cloud
Job Code: 1606472