Posted on: 11/12/2025
About the job
Experience: 5+ years
Location: Hybrid
Job Description:
- Extensive experience with Google Cloud Platform (GCP) and its data services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer.
- Proven track record of designing and building scalable data pipelines and architectures.
- Experience with ETL tools and processes.
- Design, develop, and maintain robust and scalable data pipelines using GCP services such as Dataflow, Pub/Sub, Cloud Functions, and Cloud Composer.
- Implement ETL (Extract, Transform, Load) processes to ingest data from various sources into GCP data warehouses such as BigQuery (an illustrative pipeline is sketched below).
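
For illustration, a minimal sketch of the kind of streaming ETL pipeline described above, written with the Apache Beam Python SDK (the SDK that Dataflow executes): it reads JSON events from Pub/Sub, applies a light transform, and appends rows to a BigQuery table. The project, topic, table, and field names are placeholder assumptions, not details from this role.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        # Transform step: decode the Pub/Sub payload into a BigQuery row.
        event = json.loads(message.decode("utf-8"))
        return {
            "event_id": event.get("id"),
            "user_id": event.get("user_id"),
            "event_ts": event.get("timestamp"),
        }


    def run() -> None:
        # When submitting to Dataflow, also pass --runner=DataflowRunner,
        # --project, --region, and --temp_location on the command line.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/events")
                | "ParseEvents" >> beam.Map(parse_event)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()
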
Technical Skills:
- Proficiency in SQL and experience with database design and optimization.
- Strong programming skills in Python, Java, or other relevant languages.
- Experience with data modeling, data warehousing, and big data processing frameworks.
- Familiarity with data visualization tools (e.g., Looker, Data Studio) is a plus.
- Knowledge of machine learning workflows and tools is an advantage.
Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced environment and manage multiple tasks concurrently.
- Leadership skills and experience mentoring junior engineers.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1588325