Posted on: 01/04/2026
Job Title : DevOps Engineer | Azure + multi-cloud
Location : Indore
Employment Type : Full-time
About the Role :
We are hiring a DevOps Engineer with 4+ years of experience building and supporting infrastructure and data platforms across Azure and at least one other major cloud (AWS or GCP). The ideal candidate has strong expertise in Terraform, Kubernetes, CI/CD, cloud networking, and the management of data pipelines and monitoring systems.
Responsibilities :
Infrastructure & Automation :
- Design and manage cloud infrastructure using Terraform and Kubernetes (AKS/EKS/GKE).
- Deploy and manage WAFs, Load Balancers, Firewalls, and Virtual Networks/Subnets.
- Implement CI/CD pipelines using Azure DevOps, GitHub Actions, or Bitbucket Pipelines.
- Automate backup, recovery, scaling, and failover strategies across environments.
- Manage identity, access, and policy compliance using Azure AD, IAM, and Intune/MDM.
Data Platform Operations :
- Collaborate with data teams to support Azure Data Factory, Databricks, Synapse, or equivalent.
- Maintain and monitor ETL workflows, streaming pipelines, and data storage across Blob, Data Lake, SQL, MongoDB, Redis, etc.
- Troubleshoot and optimize data flows and infrastructure performance.
Monitoring & Alerting :
- Implement monitoring with Azure Monitor, Log Analytics, Prometheus, Grafana, or ELK.
- Configure dashboards and alerting policies to detect and resolve infra/data issues proactively.
- Ensure logs, metrics, and alerts are integrated into incident response workflows.
Required Skills :
- 4+ years of hands-on experience in DevOps and cloud infrastructure.
- Strong proficiency in Terraform and Kubernetes (AKS, EKS, or GKE).
- Solid working knowledge of Azure + one more cloud (AWS or GCP).
- Experience managing WAFs, Load Balancers, Firewalls, and VPC/VNet setups.
- Hands-on with CI/CD tools : Azure DevOps, GitHub Actions, Bitbucket Pipelines.
- Experience supporting data tools : Azure Data Factory, Databricks, or equivalent.
- Familiar with MongoDB, MySQL, Redis, and cloud storage systems.
- Good understanding of cloud networking, IAM, MDM, and basic compliance/security practices.
Nice to Have :
- Scripting skills in PowerShell, Bash, or Python.
- Knowledge of Azure Synapse, BigQuery, S3, or Kafka/EventHub.
- Exposure to cost governance and FinOps best practices.
Posted in : DevOps / SRE
Functional Area : DevOps / Cloud
Job Code : 1625270