
Data Engineer - Google Cloud Platform

Posted on: 29/07/2025

Job Description

Key Responsibilities:

- Design, develop, and optimize scalable data pipelines and ETL/ELT workflows using GCP-native tools such as BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Storage (see the pipeline sketch after this list).

- Build and maintain semantic data models in Looker (LookML), ensuring accuracy, usability, and performance for self-service analytics.

- Collaborate with analysts, data scientists, and business stakeholders to translate business requirements into reusable data assets and Looker dashboards.

- Lead the design of data lake/data warehouse architectures and enforce modelling best practices.

- Follow and implement data quality frameworks, lineage tracking, and data governance policies across the platform, in line with the defined standards.

- Optimize BigQuery performance and cost, including partitioning, clustering, and query tuning (see the table-design sketch after this list).

- Develop and maintain CI/CD pipelines for data workflows and LookML repositories.

- Ensure data security, privacy, and access control compliance on GCP and Looker.

- Mentor junior engineers and contribute to the growth of engineering practices within the team.

- Deliver code according to specifications; follow coding standards, versioning, and branching conventions.

- Work with tech leads and architects to ensure good design and code quality.

- Ensure documentation is up to date (technical design, deployment guide, release notes, etc.).
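
To give a flavour of the first responsibility above, here is a minimal sketch of a streaming Pub/Sub-to-BigQuery pipeline in Apache Beam (the SDK behind Dataflow). The project, topic, table, and schema names are hypothetical placeholders for illustration, not part of the role description.

```python
# Minimal Apache Beam sketch: stream events from Pub/Sub into BigQuery.
# All project/topic/table names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # use DataflowRunner in production

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Pub/Sub delivers raw bytes; decode and parse each message as JSON.
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_date:DATE,customer_id:STRING,amount:NUMERIC",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```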
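And for the BigQuery optimization bullet, a minimal sketch of creating a partitioned, clustered table with the google-cloud-bigquery Python client; again, the dataset, table, and column names are assumptions for illustration.

```python
# Minimal sketch: create a date-partitioned, clustered BigQuery table.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_date", "DATE"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.orders", schema=schema)

# Daily partitions: queries that filter on event_date scan only the
# matching partitions, which cuts both latency and cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_date",
)

# Clustering co-locates rows with the same customer_id, speeding up
# common filters and joins on that column.
table.clustering_fields = ["customer_id"]

client.create_table(table)
```

Query tuning then builds on this layout, e.g. always filtering on the partition column so BigQuery can prune partitions.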

ROLE REQUIREMENTS:

- 8+ years in data engineering, including 3+ years on GCP and 1+ year on Looker.

- Experience with and knowledge of SDLC or Agile development frameworks and methodologies.

- Strong experience in data warehouse design, data modelling, and data lakes.

KNOWLEDGE AND SKILLS

Qualifications/Education Required:


- Degree in Computer Science or a related discipline, or relevant experience with MSBI technology.

Experience Required:

- A total of 8+ years in data engineering, including 3+ years on GCP and 1+ year on Looker.

Competencies Required:

- Good communicator

- Team player and results-oriented

- Collaborative learner

- Problem solver

Skills & Knowledge Requirements


- Strong hands-on experience with GCP data services, especially: BigQuery, Dataflow, Cloud Composer (Airflow), Pub/Sub, Cloud Functions, Cloud Storage

- Proficient in SQL and Python for data transformation and automation.

- Solid understanding of data warehousing, dimensional modeling, data lakes, and modern data architecture.

- Experience building streaming and batch pipelines at scale.

- Strong command of LookML (semantic modeling)

- Ability to design performant explores, dashboards, and governed self-service data layers

- Experience with version control (e.g., Git) in Looker projects

- Experience with Airflow DAGs or other orchestration tools (see the DAG sketch below).
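
As a rough illustration of the orchestration point above, a minimal Airflow DAG sketch follows. It assumes Airflow 2.4+ (for the `schedule` argument); the DAG id and task bodies are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+); DAG id, schedule,
# and task callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull raw files from Cloud Storage."""


def load():
    """Placeholder: load transformed rows into BigQuery."""


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    extract_task >> load_task
```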

Nice to have (Optional):

- Experience with Kubernetes and PySpark development

- Understanding of RBAC, IAM, and data security on cloud platforms

- GCP Professional Data Engineer or Looker Certification is highly desirable.

- Knowledge of Apache Spark, Kafka, or other cloud-native processing tools is advantageous.

