Data Architect - Google Cloud Platform

SMARTWORK IT SERVICES
Multiple Locations
12 - 15 Years

Posted on: 25/11/2025

Job Description

Job Title : GCP Data Architect.

Experience : 12+ Years.

Work Location : Hyderabad / Chennai.

Job Summary :

We are looking for a highly experienced GCP Data Architect with deep expertise in Data Engineering, ETL/ELT development, and Enterprise Data Warehousing.

The candidate should have strong hands-on experience in GCP cloud technologies, Airflow, Python, and Teradata.

You will be responsible for architecting scalable cloud data platforms, modernizing legacy systems, and providing technical leadership across end-to-end data solutions.

Roles & Responsibilities :

Cloud Data Architecture & Engineering :

- Architect, design, and deliver scalable Data Engineering and Data Warehousing solutions on Google Cloud Platform (GCP).

- Lead end-to-end implementation using BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, and other GCP services.

- Modernize legacy systems by integrating traditional platforms like Teradata into GCP environments (a minimal load sketch follows).
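
To make the modernization bullet above concrete, here is a minimal, non-authoritative sketch of one common building block: loading Teradata extracts that have landed in Cloud Storage into BigQuery with the google-cloud-bigquery Python client. The bucket, project, dataset, and table names are hypothetical placeholders.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Append Parquet files exported from a legacy system into a warehouse table.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(
        "gs://example-landing-bucket/teradata_export/orders/*.parquet",  # hypothetical URI
        "example-project.analytics_dw.orders",  # hypothetical table
        job_config=job_config,
    )
    load_job.result()  # Block until the load job finishes.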

ETL/ELT & Workflow Management :

- Design and optimize ETL/ELT pipelines ensuring high performance, reliability, and automation.

- Build, schedule, and manage workflows using Apache Airflow with strong hands-on Python scripting (a DAG sketch follows).
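
As an illustration of that responsibility, a minimal sketch of a scheduled Airflow DAG wrapping a Python task, assuming a recent Airflow 2.x release; the DAG id, schedule, and callable are all hypothetical:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_and_stage():
        # Hypothetical step: pull from a source such as Teradata and stage
        # files to Cloud Storage for a downstream BigQuery load.
        pass

    with DAG(
        dag_id="daily_dw_load",  # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="extract_and_stage",
            python_callable=extract_and_stage,
        )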

Data Modeling & Optimization :

- Perform conceptual, logical, and physical data modeling for analytics and reporting.

- Write and optimize complex SQL queries for large datasets in BigQuery and Teradata.

- Engineer data solutions for handling semi-structured formats such as JSON, Parquet, and XML (a query sketch follows).
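
As a simplified flavor of the SQL and semi-structured work above, a sketch of querying a JSON payload column in BigQuery from Python; the project, table, column, and dates are made up, and the date predicate is what prunes partitions so the scan over a large table stays bounded:

    from google.cloud import bigquery

    client = bigquery.Client()

    # JSON_VALUE extracts a scalar field from the semi-structured payload;
    # the WHERE clause limits the scan to one month of partitions.
    query = """
        SELECT
          JSON_VALUE(payload, '$.customer_id') AS customer_id,
          COUNT(*) AS events
        FROM `example-project.analytics_dw.raw_events`
        WHERE event_date BETWEEN '2025-01-01' AND '2025-01-31'
        GROUP BY customer_id
    """
    for row in client.query(query).result():
        print(row.customer_id, row.events)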

Governance, Quality & Security :

- Define, implement, and enforce standards for data quality, metadata management, governance, and security.

- Ensure compliance with organizational data policies and best practices.

Automation, DevOps & Agile Delivery :

- Implement CI/CD pipelines and collaborate with DevOps teams using GitHub, Jenkins, and cloud-native tools.

- Work in Agile/SAFe environments, participating in sprint planning, estimations, and team alignment.

- Document architecture patterns, data flows, best practices, and solution designs.

Leadership & Collaboration :

- Provide technical leadership and guidance to development teams.

- Collaborate with business, analytics, and engineering stakeholders to align technical solutions with business goals.

- Evaluate emerging GCP capabilities and recommend improvements for efficiency and scalability.

Required Qualifications :

- 12+ years of overall IT experience with a strong ETL, Data Warehousing, and Data Engineering background.

- 7+ years of hands-on experience with GCP cloud data services.

- Proven experience delivering 2+ large-scale GCP Data Warehousing projects.

Strong expertise in :

- BigQuery, Cloud Storage, Dataflow, Pub/Sub.

- Airflow, Python, Cloud Functions, Cloud Run.

- Teradata (including performance tuning & workload migration).

- SQL, PySpark, and distributed systems optimization.

- Deep understanding of data modeling, profiling, mapping, and validation.

- Strong communication, leadership, and analytical problem-solving skills.

Preferred Skills :

- Experience with CI/CD, GitHub, Jenkins, JIRA, Confluence.

- Knowledge of Kubernetes, Docker, or containerized workloads.

- Domain experience in the Financial, Telecom, or Retail industries.

