Posted on: 25/11/2025
Description:
Responsibilities:
- Architect and manage scalable, secure, and cost-efficient cloud platforms using GCP services.
- Design and implement infrastructure using Infrastructure as Code (IaC) tools such as Terraform.
- Drive FinOps practices to improve cloud cost governance by implementing cost tracking, forecasting, budget alerts, and automated cost controls.
- Optimise BigQuery performance through query tuning, partitioning, clustering, and table lifecycle management.
- Define and promote best practices for BigQuery usage and cost-efficient data architecture.
- Set up end-to-end observability using GCP Operations Suite (Cloud Monitoring, Logging, Trace, Profiler, Error Reporting).
- Build custom dashboards using Cloud Monitoring and Grafana; implement SLOs, SLIs, and automated alerts.
- Drive DevSecOps pipelines using CI/CD tools like Cloud Build, GitHub Actions, Jenkins, and integrate quality/security gates.
- Deploy and manage containerized workloads using technologies such as Docker and Kubernetes (GKE).
- Design and implement scalable data pipelines using GCP data services such as Cloud Storage, Pub/Sub, Dataflow, BigQuery, and Cloud Composer.
- Automate ingestion, transformation, and loading (ETL/ELT) processes across structured and unstructured data sources.
- Optimise pipeline performance and cost by leveraging Dataflow templates, streaming strategies, and efficient resource configurations.
- Ensure data pipeline observability through integration with Cloud Logging, Monitoring, and custom alerting.
- Manage and configure services like Cloud Run, GKE, Cloud Functions, Pub/Sub, Cloud SQL, and Dataproc.
- Write automation and scripting in Python, Bash, and YAML for deployments, monitoring, and infrastructure workflows.
- Build, secure, and deploy microservices architectures using API Gateway, service meshes, and IAM roles.
- Collaborate with cross-functional teams, manage client expectations, and deliver in both Agile and Waterfall models.
- Lead incident resolution efforts and post-incident analysis using observability data.
- Maintain secure identity and access management (IAM), VPC, firewall rules, and service accounts across projects.
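The observability duty above ("implement SLOs, SLIs, and automated alerts") rests on error-budget arithmetic. A minimal sketch of that calculation, in Python as the posting suggests; the function names and the 50% burn threshold are illustrative assumptions, not part of the role description:

```python
# Hypothetical sketch: SLO error-budget tracking of the kind that backs
# automated alerting. All names and thresholds are illustrative.

def error_budget_remaining(slo_target: float, total_requests: int,
                           failed_requests: int) -> float:
    """Return the fraction of the error budget still unspent.

    slo_target: availability goal, e.g. 0.999 for "three nines".
    """
    if total_requests == 0:
        return 1.0  # no traffic yet, so the budget is untouched
    allowed_failures = (1.0 - slo_target) * total_requests
    if allowed_failures == 0:
        return 0.0  # a 100% SLO leaves no budget at all
    spent = failed_requests / allowed_failures
    return max(0.0, 1.0 - spent)

def should_alert(slo_target: float, total: int, failed: int,
                 burn_threshold: float = 0.5) -> bool:
    """Fire once more than `burn_threshold` of the budget is consumed."""
    return error_budget_remaining(slo_target, total, failed) < (1.0 - burn_threshold)

if __name__ == "__main__":
    # 1,000,000 requests at a 99.9% SLO allow 1,000 failures;
    # 600 failures spend 60% of the budget, leaving 40%.
    print(round(error_budget_remaining(0.999, 1_000_000, 600), 3))  # 0.4
    print(should_alert(0.999, 1_000_000, 600))                      # True
```

In practice the request counts would come from Cloud Monitoring time series rather than function arguments, and the alert would be a Cloud Monitoring alerting policy rather than a boolean return, but the budget math is the same.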
Requirements:
- GCP Expertise: In-depth knowledge of Google Cloud Platform services, particularly BigQuery, Databricks, and Kubernetes.
- Data Engineering: Experience with data modelling, ETL processes, and data warehousing.
- Performance Tuning: Proven track record in optimising data processing and query performance in BigQuery.
- Kubernetes Management: Proficient in deploying and managing containerised applications using Kubernetes.
- Cost Management: Familiarity with GCP billing and cost management tools to monitor and optimise cloud expenditure.
- Monitoring Tools: Experience with GCP monitoring and logging tools (e.g., Stackdriver, Cloud Monitoring).
- Years of Experience: 12+ years of overall experience, including 5 to 7 years as a GCP Solution Architect.
- Education Qualification: B.E./B.Tech, BCA, or MCA (or equivalent); GCP Professional Cloud Architect certification.
Posted in: DevOps / SRE
Functional Area: Technical / Solution Architect
Job Code: 1580463