Posted on: 06/08/2025
Role and Responsibilities:
As an Enterprise Architect, you will:
- Lead the design and development of end-to-end data architectures on the Google Cloud Platform, focusing on scalability, reliability, and cost-effectiveness.
- Architect and build data pipelines and workflows using GCP services such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Composer, and Dataform (a representative pipeline sketch follows this list).
- Work closely with stakeholders to understand data requirements and translate them into technical solutions.
- Provide technical leadership and guidance to data engineering teams, ensuring best practices for data governance, security, and quality are followed.
- Define and enforce architectural standards, patterns, and principles for data solutions.
- Oversee the implementation of CI/CD pipelines for data workflows using tools like Jenkins, GitHub Actions, or Bitbucket Pipelines.
- Utilize Infrastructure as Code (IaC) tools like Terraform or Ansible to manage and automate infrastructure provisioning.
- Stay informed about new GCP services and data technologies, recommending and implementing innovations to enhance our data platform.
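To give a flavor of the pipeline work described above, here is a minimal sketch of a streaming Dataflow pipeline in Python (Apache Beam) that reads events from Cloud Pub/Sub and appends them to BigQuery. The project, topic, table, and field names are hypothetical placeholders, not part of this posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a JSON Pub/Sub payload into a BigQuery row (hypothetical fields).
    event = json.loads(message.decode("utf-8"))
    return {"user_id": event["user_id"], "action": event["action"]}


def run() -> None:
    # streaming=True because Pub/Sub is an unbounded source; the remaining
    # Dataflow options (project, region, runner) come from the command line.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # hypothetical table
                schema="user_id:STRING,action:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```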
Qualifications:
Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Experience:
- 10+ years of experience in data engineering, with at least 4 years of hands-on experience on GCP.
- Proven experience in designing and implementing complex data workflows and architectures on a cloud platform.
Certifications:
- Google Cloud Professional Data Engineer certification is strongly preferred.
Required Technical Skills:
- Advanced proficiency in Python for developing data pipelines, automation, and scripting.
- Strong SQL skills for advanced querying, data transformation, and analysis.
- Deep, hands-on experience with core GCP services, including:
  - Data Services: BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud SQL.
  - Compute & Containers: Compute Engine, Kubernetes Engine (GKE).
- Experience with CI/CD tools (e.g., Jenkins, GitHub Actions, Bitbucket Pipelines).
- Proficiency in Docker and Kubernetes for containerization and orchestration.
- Solid experience with Infrastructure as Code (IaC) tools like Terraform or Ansible.
- Familiarity with workflow orchestration tools such as Apache Airflow or Cloud Composer (a minimal DAG sketch follows this list).
- Strong understanding of Agile/Scrum methodologies.
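To illustrate the orchestration skills listed above, here is a minimal sketch of a Cloud Composer (Apache Airflow) DAG that loads daily files from Cloud Storage into BigQuery and then runs a transformation query. The bucket, dataset, table names, and query are hypothetical placeholders, not part of this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_load",        # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    # Load the day's raw CSV files from a landing bucket into BigQuery.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw",
        bucket="my-landing-bucket",                     # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],        # templated by run date
        destination_project_dataset_table="analytics.raw_sales",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the raw table into a reporting table (hypothetical query).
    transform = BigQueryInsertJobOperator(
        task_id="transform",
        configuration={
            "query": {
                "query": (
                    "SELECT region, SUM(amount) AS total "
                    "FROM analytics.raw_sales GROUP BY region"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "sales_by_region",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> transform  # run the transform only after the load succeeds
```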
Nice-to-Have Skills:
- Experience with other cloud platforms (AWS, Azure).
- Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.
Soft Skills:
- Strong problem-solving and critical-thinking abilities to tackle complex architectural challenges.
- Excellent communication skills to effectively collaborate with both technical teams and non-technical business stakeholders.
- A proactive, innovative, and continuous learning mindset.
- Ability to work independently while also excelling in a collaborative team environment.
Posted in: DevOps / SRE
Functional Area: Data Engineering
Job Code: 1525390