Posted on: 03/09/2025
Job Description:
Key Responsibilities:
- Design, develop, and maintain high-performance, scalable applications using Java and Big Data technologies
- Build and manage data pipelines to process structured and unstructured data from multiple sources (see the pipeline sketch after this list)
- Develop and maintain microservices, RESTful APIs, and other distributed systems
- Work with modern Big Data tools and technologies such as:
1. Apache Spark
2. HDFS
3. Ceph Storage
4. Solr / Elasticsearch
5. Apache Kafka
6. Delta Lake
- Write clean, efficient, and well-documented code
- Collaborate with cross-functional teams including Data Engineers, QA, and DevOps
- Participate in code reviews and contribute to continuous improvement of development processes
- Mentor and support junior developers in the team
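
To give candidates a concrete sense of the pipeline work described above, here is a minimal sketch of a batch job written against Spark's Java Dataset API. The class name, HDFS paths, and field names are illustrative assumptions, not details taken from this posting.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EventPipeline {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("event-pipeline")
                .getOrCreate();

        // Read semi-structured JSON events from HDFS (path is illustrative).
        Dataset<Row> events = spark.read().json("hdfs:///data/raw/events/");

        // Keep well-formed records and project the fields downstream jobs need.
        Dataset<Row> cleaned = events
                .filter("userId IS NOT NULL")
                .select("userId", "eventType", "timestamp");

        // Write curated output as Parquet, partitioned for later queries.
        cleaned.write()
                .mode("overwrite")
                .partitionBy("eventType")
                .parquet("hdfs:///data/curated/events/");

        spark.stop();
    }
}
```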
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 2-4 years of hands-on experience in Java development with exposure to Big Data ecosystems
- Strong knowledge of core Java, multithreading, and object-oriented programming (see the concurrency sketch after this list)
- Experience in building data processing pipelines and working with large-scale datasets
- Familiarity with Big Data components like Spark, HDFS, Kafka, Ceph, Delta Lake
- Understanding of REST APIs, microservices architecture, and distributed systems
- Strong problem-solving and analytical skills
- Good verbal and written communication skills
- Ability to work in a fast-paced, collaborative environment
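
As a rough illustration of the core Java and multithreading expectations above, the sketch below fans per-source ingest work out over a fixed thread pool using java.util.concurrent. The source names and the work done inside each task are placeholders.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelIngest {
    public static void main(String[] args) throws InterruptedException {
        // Illustrative source names only.
        List<String> sources = List.of("orders", "clicks", "payments");

        // A fixed pool bounds concurrency so a burst of sources cannot exhaust threads.
        ExecutorService pool = Executors.newFixedThreadPool(4);

        for (String source : sources) {
            pool.submit(() -> {
                // Placeholder for per-source work (fetch, parse, validate).
                System.out.println(Thread.currentThread().getName() + " processing " + source);
            });
        }

        // Reject new tasks, then wait for in-flight work to finish.
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```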
Preferred Skills (Nice to Have):
- Exposure to cloud platforms (AWS, Azure, GCP) for Big Data deployments
- Experience with NoSQL databases or search engines like Solr/Elasticsearch (see the query sketch after this list)
- Familiarity with CI/CD tools and agile development practices
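
For the Solr/Elasticsearch item, here is a minimal sketch of querying an Elasticsearch search endpoint over its REST API using Java's built-in HttpClient (Java 11+). The index name, match query, and localhost address are assumptions for illustration only.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SearchProbe {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Simple match query against a hypothetical "products" index on a local node.
        String query = "{\"query\":{\"match\":{\"name\":\"laptop\"}}}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/products/_search"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(query))
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // raw JSON hits from Elasticsearch
    }
}
```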
Posted in: Backend Development
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1539525