
Job Description

Location : Hyderabad, India


Experience : 5+ Years


Employment Type : Full Time


Role Overview :


Build, scale, and operate reliable cloud-based data platforms. Enable faster delivery of high-quality data products through automation and DataOps practices. Work at the intersection of Data Engineering and DevOps to improve platform reliability and efficiency.


Key Responsibilities :


- Design, build, and maintain scalable data infrastructure on GCP and AWS.


- Develop and manage Infrastructure-as-Code using Terraform.


- Build and maintain CI/CD pipelines for data platforms and workloads.


- Support deployment of data pipelines, dbt models, and orchestration workflows (a minimal orchestration sketch follows this list).


- Implement monitoring, logging, and alerting for data pipelines and infrastructure.


- Ensure security, compliance, and governance across data platforms.


- Optimize cloud resource utilization and data platform costs.


- Collaborate with data engineering, analytics, and data science teams.


- Maintain documentation, runbooks, and operational processes.


- Drive continuous improvements in platform reliability and developer experience.
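

To make the orchestration responsibilities concrete, here is a minimal sketch of the kind of workflow involved: an Airflow DAG (the engine behind Cloud Composer) that runs a daily dbt build. The DAG id, schedule, and dbt project paths are assumptions for the example, not details from this posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_build",        # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # Airflow 2.4+ schedule argument
        catchup=False,                   # skip backfill of missed runs
    ) as dag:
        # Run every model in an assumed dbt project available on the worker.
        dbt_build = BashOperator(
            task_id="dbt_build",
            bash_command=(
                "dbt build "
                "--project-dir /opt/dbt/analytics "  # assumed project path
                "--profiles-dir /opt/dbt"            # assumed profiles path
            ),
        )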


Key Result Areas :


- High availability and stability of data pipelines and platforms.


- Reduced deployment failures and faster release cycles.


- Improved observability and proactive incident detection.


- Automation-driven reduction in manual operational effort.


- Secure and compliant data infrastructure.


- Optimized cloud costs and improved cost visibility.


- Improved productivity of data engineering and analytics teams.


Required Skillsets :


- Strong experience with Google Cloud Platform services.


- Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, Dataflow, and Cloud Composer.


- Experience with AWS data services such as Glue, Redshift, or Lambda.


- Proficiency in Terraform for multi-cloud infrastructure management.


- Experience with CI/CD tools such as GitHub Actions and Jenkins.


- Working knowledge of data platforms like BigQuery, Snowflake, or Databricks.


- Experience with monitoring and observability tools such as Prometheus and Grafana (a freshness-check sketch follows this list).


- Strong scripting skills in Python and Bash.


- Experience with Docker and Kubernetes, including managed services such as GKE.


- Familiarity with dbt and workflow orchestration tools like Airflow.


- Understanding of data security, governance, and compliance best practices.


- Experience working in Agile development environments.
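

As a concrete (and equally hypothetical) illustration of the Python scripting and observability skills above, the sketch below uses the google-cloud-bigquery client to check a table's freshness, the sort of check that could feed Prometheus metrics or an alerting pipeline. The project, dataset, table, and staleness threshold are invented for the example.

    from datetime import datetime, timedelta, timezone

    from google.cloud import bigquery

    def table_is_fresh(table_id: str, max_staleness: timedelta) -> bool:
        """Return True if the table was last modified within max_staleness."""
        client = bigquery.Client()
        table = client.get_table(table_id)  # fetches table metadata only
        age = datetime.now(timezone.utc) - table.modified
        return age <= max_staleness

    if __name__ == "__main__":
        # Hypothetical identifiers, for illustration only.
        if not table_is_fresh("my-project.analytics.orders", timedelta(hours=24)):
            print("ALERT: orders table is stale")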


Qualifications :


- 5+ years of experience in Data Engineering, DataOps, DevOps, or Platform Engineering roles.


- Experience supporting enterprise-scale data platforms preferred.
