Posted on: 25/03/2026
Description:
Title: Big Data Scala Spark Developer - Hadoop
Exp: 4 to 8 years
Location: Pune, Hyderabad
NP: Immediate to 30 days
Job Description:
- 4 years of experience in Scala, Spark, Big Data, Hadoop, and Hive
- Must have strong technical experience; should be able to provide technical solutions for multiple modules in parallel as needed and bring tasks to closure on time
- At least 5 years of development experience in Hadoop programming (HDFS) on Scala-based data warehouse projects, with good shell scripting experience
- Responsible for gathering requirements and designing solutions to ingest data into the Big Data platform, onto Hive
- Should be able to work independently
- Prior experience with databases such as Oracle or Netezza, or with ETL, will be an added advantage
- Nice to have: experience with DQ (data quality), Autosys, Jenkins, RLM, etc.
- Proactive, with good communication skills to articulate technical issues
- Exposure to Confluence and JIRA
- Excellent communication and documentation skills; a strong team player, flexible to work in different time zones based on project needs
- Certification (mandatory): Spark or Hadoop Developer certification
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1623320