Posted on: 18/12/2025
Job Title: Data Solutions Architect - Google Cloud Platform (GCP)
Location: Noida / Bangalore
Job Type: Full-time
Job Description:
We are seeking a highly experienced and strategic Data Solutions Architect - Google Cloud Platform (GCP) to join our technology team. In this pivotal role, you will lead the design, architecture, and implementation of highly scalable, secure, and performant data solutions on GCP. The ideal candidate has deep expertise in GCP's comprehensive suite of data services, modern cloud data architecture patterns, and data engineering best practices. You will translate complex business requirements into robust technical solutions, collaborating extensively with cross-functional teams to drive innovation, ensure data reliability, and support advanced analytics and AI initiatives that deliver significant business value.
Your Role and Responsibilities:
- Lead Data Architecture on GCP: Architect and design end-to-end data solutions leveraging a wide array of GCP services, including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud Composer, and Data Catalog.
- Data Platform Development & Optimization: Design, build, and maintain scalable, robust, and secure data platforms for ingestion, processing, storage, and analytics of all data types. Continuously optimize data warehouse and lakehouse performance through advanced schema design, partitioning, clustering, and query optimization techniques in BigQuery (a brief sketch follows this list).
- Solution Design & Implementation: Translate complex business requirements into clear technical architectures, detailed data models, and implementation plans, ensuring alignment with organizational standards and best practices.
- Integration & ETL/ELT Pipelines: Develop, optimize, and orchestrate robust data pipelines using GCP-native tools such as Dataflow, Dataproc, Dataprep, and Cloud Composer (Apache Airflow), and integrate third-party ETL/ELT solutions where appropriate.
- Data Governance & Security: Define, implement, and enforce comprehensive data governance, security, and compliance frameworks across all GCP data assets, utilizing services such as IAM, encryption, Cloud DLP, VPC Service Controls, and audit logging.
- Cloud Migration Leadership: Lead and support the migration of legacy data platforms to GCP, ensuring a seamless transition, minimal disruption, and adherence to enterprise standards.
- Technical Leadership & Mentorship: Provide strong technical leadership, guidance, and mentorship to data engineering teams, fostering adoption of best practices for data solution design, development, and deployment on GCP.
- Innovation & Best Practices: Stay at the forefront of the latest GCP data technologies and industry trends. Champion and implement innovative solutions, architectural patterns, and best practices for cloud data architecture and engineering.
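For illustration, here is a minimal sketch of the kind of BigQuery table design the optimization work above involves: creating a day-partitioned, clustered table with the google-cloud-bigquery Python client. The project, dataset, and schema names are hypothetical placeholders, not part of any actual environment.

```python
from google.cloud import bigquery

# Hypothetical project, dataset, and schema, used purely for illustration.
client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)

# Partition by day on the event timestamp so date-filtered queries
# scan only the relevant partitions...
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
# ...and cluster on customer_id so filters on that column read fewer blocks.
table.clustering_fields = ["customer_id"]

client.create_table(table)
```

Queries that filter on event_ts and customer_id then prune partitions and clustered blocks instead of scanning the whole table, which is the main lever for controlling BigQuery cost and latency.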
Preferred Technical and Professional Experience:
- Education: Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related technical field (or equivalent practical experience).
Experience:
- 8+ years of experience in data engineering, data architecture, or analytics.
- At least 3 years in a dedicated data architect or solutions architect role.
- Minimum of 5 years of hands-on experience designing and implementing enterprise-scale data solutions specifically on Google Cloud Platform.
GCP Expertise: Proven expert-level proficiency with core and advanced GCP data services, including but not limited to:
- BigQuery (advanced SQL, optimization, data modeling, partitioning, clustering)
- Cloud Storage (data lake design, lifecycle management)
- Dataflow (Apache Beam for batch and streaming processing; a streaming sketch follows this list)
- Dataproc (managed Apache Spark/Hadoop)
- Pub/Sub (real-time messaging)
- Cloud Composer (Apache Airflow for workflow orchestration)
- Data Catalog (metadata management, data discovery)
- Cloud Functions / Cloud Run (serverless compute for data processing)
- Looker (BI integration)
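As a rough, non-authoritative sketch of how several of these services compose, the Apache Beam pipeline below reads JSON events from Pub/Sub and streams them into BigQuery; it runs on Dataflow when launched with --runner=DataflowRunner. The topic, table, and schema names are hypothetical.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical topic and table; pass --runner=DataflowRunner (plus project,
# region, and temp_location flags) to execute this on Dataflow.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Pub/Sub delivers raw bytes; decode and parse each message into a dict.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events_raw",
            schema="event_ts:TIMESTAMP,customer_id:STRING,amount:NUMERIC",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

Because Beam is a unified model, essentially the same pipeline runs in batch mode against bounded sources, which is one reason Dataflow suits both the batch and streaming work described here.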
Technical Proficiency:
- Exceptional proficiency in SQL (with a focus on BigQuery optimization) and Python (including libraries such as Pandas and NumPy).
- Extensive experience with data modeling techniques (Kimball dimensional and Inmon normalized approaches) and designing data warehouse/data lakehouse architectures.
- Hands-on experience with ETL/ELT tools, orchestration frameworks, and API-driven data integration (an orchestration sketch follows this list).
- Proficiency with Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager for provisioning GCP resources.
- Familiarity with event-driven architectures and messaging systems (e.g., Kafka).
- Understanding of containerization technologies (Docker, Kubernetes, GKE) and CI/CD pipelines (e.g., Cloud Build, Cloud Deploy) for data workloads.
- Exposure to NoSQL databases (e.g., Firestore, Bigtable, MongoDB) and various file formats (JSON, Avro, Parquet).
- Knowledge of machine learning workflows and MLOps practices on GCP (e.g., Vertex AI) is a plus.
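To make the orchestration point concrete, below is a minimal Cloud Composer (Apache Airflow) DAG sketch that schedules one daily ELT step in BigQuery; the DAG id, table names, and SQL are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical daily ELT job: aggregate raw events into a summary table.
with DAG(
    dag_id="daily_events_elt",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_daily_summary = BigQueryInsertJobOperator(
        task_id="load_daily_summary",
        configuration={
            "query": {
                # {{ ds }} is Airflow's templated execution date.
                "query": """
                    INSERT INTO `my-project.analytics.daily_summary`
                    SELECT DATE(event_ts) AS day,
                           customer_id,
                           SUM(amount) AS total_amount
                    FROM `my-project.analytics.events`
                    WHERE DATE(event_ts) = '{{ ds }}'
                    GROUP BY day, customer_id
                """,
                "useLegacySql": False,
            }
        },
    )
```

In Composer, placing a file like this in the environment's dags/ bucket is enough for Airflow to pick it up and run it on schedule.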
- Certifications (Highly Preferred): Google Cloud Professional Data Engineer and/or Google Cloud Professional Cloud Architect.
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1592502