Posted on: 25/09/2025
Key Responsibilities:
- Design, develop, and maintain data processing applications using Scala and Apache Spark (see the illustrative sketch after this list).
- Collaborate with cross-functional teams to gather requirements and deliver scalable, efficient solutions.
- Implement test-driven development practices to improve reliability and maintainability.
- Manage deployment of artifacts from lower to higher environments, ensuring smooth transitions.
- Troubleshoot and optimize Spark performance issues for large-scale data processing.
- Actively participate in Agile ceremonies (sprint planning, development, reviews) and deliver high-quality, timely outcomes.
- Provide production support for critical data batches, including timely issue resolution and hotfixes.
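As a rough illustration of the kind of work described above, the following is a minimal sketch of a Spark batch job in Scala. It assumes a Spark 3.x / Scala setup; the object name, input/output paths, and column names are illustrative only. The transformation is kept as a pure function separate from I/O so it can be unit-tested, in line with the test-driven development practice mentioned above.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object DailyOrderTotals {

  // Pure transformation kept separate from I/O so it can be unit-tested.
  // Column names ("status", "customer_id", "amount") are illustrative.
  def totalsPerCustomer(orders: DataFrame): DataFrame =
    orders
      .filter(col("status") === "COMPLETED")
      .groupBy(col("customer_id"))
      .agg(sum(col("amount")).as("total_amount"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-order-totals")
      .getOrCreate()

    // Illustrative paths; a real job would take these from configuration.
    val orders = spark.read.parquet("/data/raw/orders")

    totalsPerCustomer(orders)
      .write
      .mode("overwrite")
      .parquet("/data/curated/daily_order_totals")

    spark.stop()
  }
}
```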
Skills / Competencies:
- Proven expertise in Apache Spark (development and performance optimization; see the optimization sketch after this list).
- Solid understanding of data structures, algorithms, and design patterns for efficient data processing.
- Good knowledge of SQL and database concepts.
- Strong analytical and problem-solving skills.
- Familiarity with DevOps practices (CI/CD, deployment automation, monitoring) is a plus.
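To give a flavour of the performance-optimization skills listed above, here is a minimal sketch of a common Spark tuning pattern: broadcasting a small dimension table to avoid shuffling a large fact table, plus adjusting shuffle parallelism. The object name, paths, and column names are illustrative assumptions, not part of the role description.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.broadcast

object JoinOptimizationExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("join-optimization-example")
      // Tuning shuffle parallelism is a common first step when a stage
      // produces too many tiny tasks or too few oversized ones.
      .config("spark.sql.shuffle.partitions", "200")
      .getOrCreate()

    // Illustrative paths: a large fact table and a small reference table.
    val events: DataFrame    = spark.read.parquet("/data/raw/events")
    val countries: DataFrame = spark.read.parquet("/data/ref/countries")

    // Broadcasting the small side avoids shuffling the large table,
    // a typical fix for slow joins over large datasets.
    val enriched = events.join(broadcast(countries), Seq("country_code"))

    enriched.write.mode("overwrite").parquet("/data/curated/enriched_events")
    spark.stop()
  }
}
```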
Primary Skills (Non-Negotiable):
- Apache Spark (application development & optimization).
- Big Data ecosystem knowledge.
- Strong problem-solving & debugging skills.
- Agile delivery mindset.
Manager's Note:
Scala coding is mandatory; candidates without solid Scala hands-on experience will not be considered.
Posted in: Backend Development
Functional Area: Backend Development
Job Code: 1552253