
DevOps Engineer - IaC (Terraform)

Posted on: 10/12/2025

Job Description


About the Company:

Platform 3 Solutions is a global technology company helping enterprises modernize, decommission, and archive legacy applications and data. Through our flagship brand Archon, we enable organizations to securely retain business-critical information while reducing IT costs, simplifying compliance, and accelerating digital transformation. Our solutions empower leading enterprises across industries to make data accessible, auditable, and future-ready.

What We Do:

Our Archon Suite offers advanced capabilities in data archival, extraction, and analytics, supporting enterprises in managing large-scale system decommissioning and regulatory retention. With a focus on innovation, performance, and scalability, we help businesses bridge the gap between legacy systems and modern cloud environments seamlessly.

We're looking for a hands-on Google Cloud Platform consultant to design, build, and operate scalable data platforms on GCP.


The right candidate will be fluent with GCP data services (Dataproc, Cloud Storage, BigQuery, Data Catalog, IAP, Cloud Asset Inventory), able to containerise and deploy workloads (Kubernetes/GKE preferred), and comfortable translating existing AWS deployments into equivalent, secure, cost-effective GCP architectures. DevOps experience (CI/CD, Terraform, Docker) and production support skills are a strong plus.

Responsibilities:

- Architect, deploy, and operate data processing and analytics solutions on GCP (Dataproc, BigQuery, Cloud Storage, Data Catalog, and, where appropriate, Dataflow and Pub/Sub).

- Translate and map existing AWS data/analytics deployments to GCP: plan migrations, identify gaps, estimate costs, and provide runbooks.

- Build and maintain containerised workloads and orchestrate them on GKE; help teams adopt Kubernetes best practices.

- Implement IAM, IAP, and Cloud Asset Inventory to secure resources and manage access control.

- Implement Infrastructure-as-Code (Terraform/Deployment Manager) for repeatable provisioning.

- Develop CI/CD pipelines for data applications (Cloud Build, Jenkins, GitHub Actions, or similar).

- Optimise performance and cost for BigQuery, Dataproc clusters, and storage architecture (see the cost-estimation sketch after this list).

- Tune Spark jobs and manage Dataproc cluster configurations.

- Implement monitoring, logging, alerting, and operational dashboards with Cloud Monitoring and Cloud Logging.

- Troubleshoot production incidents, perform root cause analysis, and drive corrective actions.

- Document architectures, operational runbooks, and migration playbooks.

- Coach/partner with application and AWS teams to ensure parity and knowledge transfer.
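
As a rough illustration of the BigQuery cost-optimisation work listed above, the following minimal Python sketch uses a dry-run query to estimate bytes scanned (and therefore on-demand cost) before anything is executed. The project, dataset, and query are placeholders, not details of this role's actual environment.

    from google.cloud import bigquery

    # Placeholder project ID; assumes google-cloud-bigquery is installed and
    # application-default credentials are available.
    client = bigquery.Client(project="example-project")

    # dry_run=True validates the query and reports bytes scanned without
    # running it; disabling the cache keeps the estimate honest.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

    query = """
        SELECT event_date, COUNT(*) AS events
        FROM `example-project.analytics.events`
        GROUP BY event_date
    """

    job = client.query(query, job_config=job_config)
    gib_scanned = job.total_bytes_processed / 2**30
    print(f"Estimated scan: {gib_scanned:.2f} GiB")

Under on-demand pricing, bytes scanned is the main cost driver, so a pre-flight check like this is a simple guardrail before scheduling heavy queries.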

Requirements:

- Strong hands-on experience with GCP services: Dataproc, Cloud Storage, BigQuery, Data Catalog, IAM, IAP, Cloud Asset Inventory.

- Proficiency in Docker and Kubernetes with practical knowledge of deployments, services, scaling, and resource management.

- Understanding of Spark, ETL design, and Parquet-based processing (a brief PySpark sketch follows this list).

- Strong networking and security understanding (VPC, IAM, firewall rules, service accounts).

- Practical exposure to DevOps processes, CI/CD, and automation workflows.

- Experience with Infrastructure-as-Code using Terraform.

- Strong ability to interpret AWS deployments and convert them to GCP equivalents.

- Experience with Cloud Monitoring, Logging, and building alerting systems.

- Effective communication and documentation skills.

- Experience with Dataflow, Pub/Sub, Composer, or Cloud Functions.

- Understanding of hybrid and multi-cloud architectures.

- Familiarity with encryption, DLP, and compliance frameworks.

- Knowledge of cost management strategies across both AWS and GCP.
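
To make the Spark, ETL, and Parquet expectations above more concrete, here is a minimal PySpark sketch of the kind of job-level tuning this role would own on Dataproc. The application name, bucket paths, and configuration values are illustrative assumptions rather than a prescribed setup.

    from pyspark.sql import SparkSession

    # Illustrative session-level tuning; on Dataproc these properties are often
    # set at cluster-creation or job-submission time instead. Values are examples.
    spark = (
        SparkSession.builder
        .appName("example-etl")
        # Size shuffle parallelism to the data volume rather than the default of 200.
        .config("spark.sql.shuffle.partitions", "400")
        # Adaptive query execution coalesces small shuffle partitions and mitigates skew.
        .config("spark.sql.adaptive.enabled", "true")
        .getOrCreate()
    )

    # Columnar Parquet keeps scans compressed and predicate-friendly; gs:// paths
    # resolve through the GCS connector that ships with Dataproc.
    events = spark.read.parquet("gs://example-bucket/raw/events/")
    daily = events.groupBy("event_date").count()
    daily.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_counts/")

Once such settings prove out, they can be pushed down to Dataproc cluster or job properties, keeping tuning decisions in version control alongside the Terraform that provisions the cluster.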

