Posted on: 09/04/2026
Description: We are looking for passionate Data Engineers who thrive on building scalable data platforms using modern cloud and big data technologies.
Role: Data Engineer
Key Responsibilities:
- Design and develop scalable data pipelines on GCP
- Build end-to-end data ingestion and processing systems from sources such as:
1. Traditional databases
2. API-based data sources
- Work on large-scale distributed systems and big data platforms
- Collaborate with cross-functional teams to support BI and Analytics use cases
- Ensure data quality, reliability, and performance optimization
- Contribute to architecture and design of data platforms
Required Skills & Experience:
- Strong experience in GCP-based data engineering
- Hands-on experience with Hadoop ecosystem (preferred if GCP exposure is limited)
- Proficiency in programming languages: Python, Scala, Java
- Strong understanding of data storage formats and their trade-offs
- Experience in building end-to-end data pipelines
- Good analytical and problem-solving skills
- Strong communication and stakeholder management abilities
- Data Engineering certification is a plus
GCP Tools & Technologies:
Storage:
- Cloud SQL, Cloud Storage, BigQuery
- Cloud Bigtable, Cloud Spanner, Datastore
Data Ingestion:
- Pub/Sub, Kafka
- App Engine, Kubernetes Engine
- Stackdriver, Dataprep, Microservices
Scheduling:
- Cloud Composer
Processing:
- Cloud Dataproc, Dataflow, Dataprep
CI/CD & Tools:
- GitLab / Bitbucket + Jenkins
- Atlassian Suite
Good to Have:
- Experience in Retail or eCommerce domain
- Strong interest in Data Engineering as a career path
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1627350