Posted on: 05/02/2026
Job Summary:
We are looking for an experienced GCP Data Engineer to design, build, and own scalable data platforms on Google Cloud. The ideal candidate will play an architectural role, driving end-to-end data solutions, defining best practices, and mentoring team members. You will work closely with stakeholders, analysts, and data scientists to deliver reliable, high-performance data pipelines and analytics platforms.
Qualifications:
- Strong hands-on experience with GCP services such as:
a. BigQuery
b. Cloud Storage
c. Dataflow (Apache Beam)
d. Pub/Sub
e. Cloud Composer (Airflow)
f. Cloud Functions / Cloud Run
- Experience designing batch and streaming data pipelines
- Expertise in data warehousing and analytics architectures
- Advanced proficiency in Python (data processing, orchestration, APIs, automation)
- Strong command of SQL (complex queries, performance tuning, analytics use cases)
- Experience defining data platform architecture, patterns, and best practices
- Strong understanding of data modeling, partitioning, clustering, and optimization
- Ability to translate business requirements into technical designs
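As an illustration of the batch-pipeline style these qualifications describe, here is a minimal sketch in plain Python (no GCP dependencies; the stage names and sample records are hypothetical, not from any real system):

```python
# Minimal batch pipeline sketch: extract -> transform -> load,
# mirroring the stage structure a Dataflow/Beam pipeline would express.
# All data and stage names here are illustrative only.

def extract():
    # In a real pipeline this stage would read from Cloud Storage or Pub/Sub.
    return [
        {"user": "a", "amount": 10},
        {"user": "b", "amount": 25},
        {"user": "a", "amount": 5},
    ]

def transform(records):
    # Aggregate amounts per user: a grouped-sum analytics step.
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0) + rec["amount"]
    return totals

def load(totals):
    # In a real pipeline this stage would write to a BigQuery table.
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('a', 15), ('b', 25)]
```

In a production Dataflow job the same three stages would become Beam transforms (a source read, a `GroupByKey`/combine step, and a BigQuery sink), but the stage boundaries are the same.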
Why join us?
You'll collaborate on multiple global projects, gaining hands-on experience across a range of technologies at once.
More reasons to join us:
- 4.4 Glassdoor Rating
- Fully remote work environment
- Exposure to cutting-edge technologies and international clients spanning various industries
- Opportunities to work on diverse projects and technologies, with cross-domain training, support for career or domain transitions, and certification reimbursement
- Profitable and bootstrapped company
- Flexible working hours with a 5-day workweek
- Over 30 paid leaves annually
- Merit-based compensation with above-average annual increments
- Sponsored team luncheons, festive celebrations, and semi-annual retreats
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1609918