- Strong expertise in Google Cloud Platform (GCP) architecture and data services, with proven experience designing and deploying large-scale enterprise data solutions.
- Experience with core GCP services such as Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, and Cloud Functions.
- Deep expertise in BigQuery, including data modeling, schema design, performance tuning, and building data pipelines for large-scale analytics (a brief illustrative sketch follows this list).
- Hands-on experience deploying and managing scalable data solutions in production on GCP.
- Proficiency with Infrastructure as Code (IaC) tooling, Terraform in particular (required), with a focus on creating reusable, modular configurations for data infrastructure.
- Deep understanding of cloud-native data concepts, including data warehousing, data lakes, ETL/ELT processes, and building analytical systems.
- Experience with CI/CD pipelines and GitOps for automated build, test, and deployment of data workflows on GCP.
- Strong problem-solving skills and the ability to translate complex business requirements into practical and scalable data solutions.
- Strong communication and collaboration skills to work effectively in interdisciplinary teams.
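
To make the BigQuery expectations above concrete, here is a minimal sketch of the kind of work involved, assuming the google-cloud-bigquery Python client; the project, dataset, table, and field names are hypothetical, and the snippet is illustrative rather than part of the role's requirements.

```python
# Hypothetical example of schema design, partitioning, and in-warehouse ELT
# on BigQuery. All project/dataset/table/field names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

# Schema design: explicit types and modes keep downstream analytics predictable.
schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("example-project.analytics.events", schema=schema)
# Performance tuning: partition by event date and cluster by customer so that
# typical filters prune storage instead of scanning the full table.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# ELT step: transform inside the warehouse rather than in an external engine.
job = client.query(
    """
    CREATE OR REPLACE TABLE analytics.daily_revenue AS
    SELECT DATE(event_ts) AS day, SUM(amount) AS revenue
    FROM analytics.events
    GROUP BY day
    """
)
job.result()  # block until the transform finishes
```

Partitioning combined with clustering is the standard lever for keeping BigQuery scan costs flat as tables grow, which is why it appears alongside schema design in the expectations above.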
Preferred Additional Skills:
- Google Cloud Professional Data Engineer or Professional Cloud Architect certification.
- Publications or open-source contributions in data architecture, data engineering, or related fields.
- Familiarity with emerging AI platforms and frameworks, such as Google Agentspace.