Posted on: 13/03/2026
Job Description:
Roles:
- Set up and run Hadoop development frameworks.
- Collaborate with a team of business domain experts, data scientists, and application developers to identify relevant data for analysis and develop the Big Data solution.
- Explore and learn new technologies for creative business problem-solving.
Job Requirements:
- Ability to develop and manage scalable Hadoop cluster environments
- Ability to design solutions for Big Data applications
- Experience in Big Data technologies such as HDFS, Hadoop, Hive, YARN, Pig, HBase, Sqoop, and Flume
- Hands-on experience with Big Data services in a cloud environment
- Experience with Spark, PySpark, Python or Scala, Kafka, Akka, core and advanced Java, and Databricks
- Knowledge of how to create and debug Hadoop and Spark jobs
- Experience with NoSQL technologies such as HBase, Cassandra, or MongoDB, and with a Hadoop distribution such as Cloudera or Hortonworks
- Familiarity with data warehousing concepts, distributed systems, data pipelines, and ETL
- Familiarity with data visualization tools such as Tableau
- Good communication and interpersonal skills
- Minimum 6 years of professional experience, with 3+ years of Big Data project experience
- B.Tech/B.E. from a reputed institute preferred
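For candidates assessing the "create and debug Hadoop and Spark jobs" requirement above, the core idea behind such jobs is the map/shuffle/reduce pattern. The sketch below is a plain-Python illustration of that pattern (the classic word count) under the assumption that it helps clarify the paradigm; it is not actual Hadoop or Spark code, and the function names are illustrative only.

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    # Shuffle: group values by key, as the framework does between map and reduce
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts collected for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big cluster", "data pipeline"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'cluster': 1, 'pipeline': 1}
```

In Hadoop or Spark the same three stages run distributed across a cluster, with the shuffle handled by the framework rather than in local memory.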
Posted in: Data Analytics & BI
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1620273