Posted on: 25/09/2025
The Role : Big Data Developer - Hadoop
Key Responsibilities :
- Work with Hadoop, Hive, and Spark for large-scale data processing (see the sketch after this list).
- Implement data solutions using GCP (Google Cloud Platform) services.
- Collaborate with teams to design and deploy scalable data architectures.
- Troubleshoot and optimize performance across various data platforms.
- Ensure data integrity and efficient processing across systems.
- Mentor junior developers and contribute to team knowledge sharing.
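To make the first responsibility concrete, below is a minimal PySpark sketch of a Hive-to-GCS batch job. It is illustrative only: the table name, columns, and bucket path are hypothetical placeholders, and it assumes a Spark cluster configured with Hive support and the GCS connector.

    # Minimal sketch: aggregate a Hive table with Spark and land the
    # result on GCS. All names below are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("daily-events-aggregation")
        .enableHiveSupport()  # read managed Hive tables directly
        .getOrCreate()
    )

    # Read a (hypothetical) Hive table and aggregate at scale.
    events = spark.read.table("warehouse.user_events")
    daily = (
        events.groupBy("event_date", "event_type")
        .agg(
            F.count("*").alias("event_count"),
            F.countDistinct("user_id").alias("unique_users"),
        )
    )

    # Write partitioned Parquet to a (hypothetical) GCS bucket.
    (daily.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("gs://example-bucket/aggregates/daily_events"))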
Skills Required :
- Hands-on experience with Hadoop / Hive / Spark.
- Proficiency in writing and analyzing complex SQL queries (see the sketch after this list).
- A minimum of 5 years of development experience.
- Strong knowledge of GCP.
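As an illustration of the SQL proficiency listed above, here is a sketch of a windowed ranking query executed through Spark SQL; the warehouse.sales table and its columns are hypothetical.

    # Illustrative "complex SQL" example: top 3 products per region by
    # revenue, using a CTE and a window function. Names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("sql-analysis-example")
        .enableHiveSupport()
        .getOrCreate()
    )

    top_products = spark.sql("""
        WITH ranked AS (
            SELECT
                region,
                product_id,
                SUM(amount) AS revenue,
                RANK() OVER (
                    PARTITION BY region
                    ORDER BY SUM(amount) DESC
                ) AS rnk
            FROM warehouse.sales
            GROUP BY region, product_id
        )
        SELECT region, product_id, revenue
        FROM ranked
        WHERE rnk <= 3
    """)
    top_products.show()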
Qualifications & Experience :
- Bachelor's degree in Computer Science, IT, Engineering, or a related field.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1551337