Posted on: 11/02/2026
Description:
Key Responsibilities:
- Build and maintain ETL processes for data transformation and integration
- Work with Big Data technologies to process large datasets efficiently
- Develop data solutions using Python and SQL
- Implement and optimize data warehousing solutions
- Utilize Apache Spark for distributed data processing
- Collaborate with BI teams to enable reporting and analytics
- Ensure data quality, performance, and reliability of systems
- Work in an Agile environment and collaborate with cross-functional teams
Required Skills:
- 6-9 years of experience in Data Engineering
- Strong hands-on experience with Google Cloud Platform (GCP)
- Proficiency in Python and SQL
- Experience with ETL development and data warehousing concepts
- Hands-on experience with Apache Spark
- Good understanding of Big Data ecosystems
- Exposure to Business Intelligence (BI) tools and reporting systems
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1611758