hirist

Hadoop Developer - Big Data/Spark

Recruitment Hub 365
Vijayawada
3 - 10 Years

Posted on: 22/12/2025

Job Description


Job Title : Big Data Hadoop Developer

Work Mode : Monday to Friday (5 days work from office)

Experience : 2 to 10 years

Notice Period : Immediate Joiner

Must-Have Experience : Big Data, Spark, Hive, and the Hadoop ecosystem (HDFS, YARN, MapReduce)

About the Role : We are looking for an experienced Big Data Hadoop Developer to design and build scalable big data systems and pipelines. The ideal candidate will have deep expertise in Hadoop, Spark, Hive, and ETL development, along with strong problem-solving skills and the ability to work in a fast-paced environment.

Roles & Responsibilities :

- Design, develop, and maintain scalable big data systems and data pipelines.

- Implement data processing frameworks and optimize large datasets using Hadoop, Spark, and Hive.

- Develop and maintain ETL processes ensuring high data availability, accuracy, and quality for downstream applications.

- Collaborate with data architects, analysts, and business teams to translate business requirements into scalable big data solutions.

- Perform data validation, quality checks, and troubleshooting to ensure pipeline reliability.

- Optimize data storage, retrieval, and performance across the Hadoop ecosystem.

- Ensure security, governance, and compliance across big data environments.
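The data validation and quality checks listed above would normally run inside a Spark job; a minimal pure-Python sketch of the kind of record-level check involved is below (the field names "user_id", "event_ts", and "amount" are hypothetical, not from this posting):

```python
# Sketch of a record-level data-quality check of the kind an ETL
# pipeline applies before loading data for downstream applications.
# Field names ("user_id", "event_ts", "amount") are hypothetical.

REQUIRED_FIELDS = ("user_id", "event_ts", "amount")

def validate_record(record: dict) -> bool:
    """A record passes if every required field is present and non-null,
    and the amount is a non-negative number."""
    if any(record.get(f) is None for f in REQUIRED_FIELDS):
        return False
    amount = record["amount"]
    return isinstance(amount, (int, float)) and amount >= 0

def split_valid_invalid(records):
    """Split records into (valid, invalid) lists, mirroring the common
    pattern of routing bad rows to a quarantine table for review."""
    valid, invalid = [], []
    for rec in records:
        (valid if validate_record(rec) else invalid).append(rec)
    return valid, invalid
```

In an actual Spark pipeline the same predicate would be expressed as a `DataFrame.filter` condition rather than a Python loop, so it runs distributed across the cluster.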

Required Skills & Qualifications :

- 2-10 years of hands-on experience in Big Data technologies.

- Strong expertise in the Hadoop ecosystem (HDFS, MapReduce, YARN).

- Proficiency in Apache Spark (batch and streaming) and Hive.

- Experience in designing and implementing data pipelines and ETL workflows.

- Solid understanding of data optimization, partitioning, and performance tuning for large datasets.

- Familiarity with NoSQL databases (HBase, Cassandra, MongoDB) is an added advantage.

- Experience with Java, Scala, Python, or Shell scripting.

- Strong analytical and problem-solving skills.
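Partitioning is central to the optimization and performance-tuning skills listed above; a minimal sketch of Hive-style daily partitioning and partition pruning follows (the directory layout and table root are illustrative assumptions):

```python
from datetime import date

def partition_path(table_root: str, dt: date) -> str:
    """Build a Hive-style partition directory for a daily-partitioned
    table, e.g. <root>/year=2025/month=12/day=22 (layout is illustrative)."""
    return f"{table_root}/year={dt.year}/month={dt.month:02d}/day={dt.day:02d}"

def prune_partitions(all_days, start: date, end: date):
    """Partition pruning: keep only partitions whose date falls within
    [start, end], so a query scans a fraction of the table instead of
    performing a full scan."""
    return [d for d in all_days if start <= d <= end]
```

Laying data out this way lets the query engine skip entire directories when a query filters on the partition columns, which is one of the main levers for tuning large Hive tables.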

Preferred Qualifications :

- Bachelor's/Master's degree in Computer Science, IT, or a related field.

- Experience with real-time data processing tools (Kafka, Flink, Storm).

- Exposure to data governance and security frameworks within big data environments.



Functional Area

Big Data / Data Warehousing / ETL

Job Code

1593256