Posted on: 31/08/2025
Job Description:
- Design, develop and implement data ingestion pipelines to collect, process, and store large and complex datasets using GCP technologies.
- Design and implement data platforms using GCP services such as Cloud Storage, BigQuery, and Dataflow.
- Develop data pipelines to ingest, transform, and load data from various sources into GCP.
- Optimize data processing performance and efficiency using GCP tools and technologies.
- Collaborate with data scientists and analysts to design and implement data models and algorithms for analysis and machine learning.
- Ensure the security, availability, and reliability of Cloud Data Platform systems.
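The ingest-transform-load responsibilities above can be sketched in miniature with plain Python. This is an illustrative sketch only, not a GCP API: the input stands in for newline-delimited JSON read from a source like Cloud Storage, the in-memory `table` list stands in for a BigQuery sink, and names such as `ingest`, `transform`, and `load` are hypothetical.

```python
import json
from typing import Iterable, Iterator

def ingest(lines: Iterable[str]) -> Iterator[dict]:
    """Parse raw NDJSON lines, skipping malformed records."""
    for line in lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production, route to a dead-letter sink instead

def transform(records: Iterable[dict]) -> Iterator[dict]:
    """Normalize field types and drop records missing a user id."""
    for rec in records:
        if "user_id" not in rec:
            continue
        yield {"user_id": str(rec["user_id"]),
               "amount": float(rec.get("amount", 0.0))}

def load(rows: Iterable[dict], sink: list) -> int:
    """Append rows to the sink; return the number of rows loaded."""
    count = 0
    for row in rows:
        sink.append(row)
        count += 1
    return count

# One valid record, one malformed line, one record missing user_id.
raw = ['{"user_id": 1, "amount": "9.5"}', 'not json', '{"amount": 2}']
table: list = []
loaded = load(transform(ingest(raw)), table)  # loaded == 1
```

In a real Dataflow job each stage would be a Beam PTransform and the sink a BigQuery table, but the same staged, generator-based shape applies.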
Required Qualifications:
- Bachelor's Engineering Degree in Computer Science, Information Systems, or relevant discipline.
- 8 to 15 years of relevant technology experience.
- Deep experience in GCP cloud data development.
- Experience with GCP data technologies such as Cloud Storage, BigQuery, and Dataflow, as well as Spark and Kafka.
- Proficiency in programming languages such as Scala, Java, or Python.
- Experience with distributed computing and database systems.
- Experience with troubleshooting data issues and identifying root causes of performance bottlenecks.
- Stays current with the latest trends and technologies in data engineering and GCP.
- Strong problem-solving and analytical skills.
- Excellent verbal & written communication and collaboration skills.
- Customer-focused, adaptable to change, collaborative, and able to multitask across multiple products and projects.
Posted in: Data Analytics & BI
Functional Area: Technical / Solution Architect
Job Code: 1538433