Posted on: 09/12/2025
Description :
Our client is a global edtech product organization. They are looking for a suitable candidate for the role below.
Location : Bangalore.
Position : Principal Software Engineer (Scala Big Data).
Experience : 14+ years.
Primary Skills : Strong Scala, Spark, Strong ETL, SQL, Performance tuning, AWS.
Notice Period : Up to 45 days.
Summary :
Join our innovative educational technology organization as a Principal Software Engineer. Leverage your expertise in Scala, Spark, Snowflake, databases, and Big Data to architect and deliver scalable, impactful software solutions. In this role, you'll lead solution engineering efforts, drive new platform and product development, analyze and enhance system architecture, and collaborate with product managers to plan and execute smooth feature rollouts in an Agile environment.
Essential duties/responsibilities :
- Build scalable, efficient, and high-performance pipelines and workflows for processing large volumes of batch and real-time data.
- Maintain and enhance existing software systems to ensure performance and reliability.
- Recommend and implement technology upgrades to drive continuous improvement.
- Support real-time streams, ETL pipelines, data warehouses, and reporting services.
- Design and develop data frameworks, applications, and microservices that seamlessly integrate with other services.
- Utilize Big Data tools such as Kafka, AWS S3 Data Lake, EMR, and Spark to ingest, store, transform, and query data.
- Adhere to coding best practices, including unit testing, design/code reviews, and comprehensive documentation.
- Conduct thorough code reviews to maintain quality, mentor junior team members, and promote continuous learning.
- Perform performance analyses and capacity planning for each release.
- Work effectively as part of an Agile team, contributing to process improvements and innovative solutions.
- Implement and promote security protocols and data governance standards across development projects.
- Proactively introduce new approaches to overcome software challenges throughout the product lifecycle.
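To give a flavor of the kind of work the duties above describe, here is a minimal sketch of a functional, test-friendly batch transform step in Scala. The record format ("userId,score"), type names, and object name are purely illustrative assumptions, not part of the actual codebase:

```scala
// Hypothetical raw event: one CSV line per record, e.g. "user42,17".
final case class Event(userId: String, score: Int)

object EventEtl {
  // Parse a raw line into an Event; malformed input yields None
  // instead of throwing, keeping the pipeline total and testable.
  def parse(line: String): Option[Event] =
    line.split(",") match {
      case Array(id, s) if id.nonEmpty => s.toIntOption.map(Event(id, _))
      case _                           => None
    }

  // Aggregate total score per user, silently dropping malformed records.
  def totals(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(parse)
      .groupMapReduce(_.userId)(_.score)(_ + _)
}
```

Keeping parsing and aggregation as pure functions like this makes each stage unit-testable in isolation, which is the style of code the role emphasizes; the same shape carries over to Spark transformations on `Dataset[Event]`.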
Required job skills :
- Expertise in writing high-quality, well-structured Scala code with an emphasis on functional programming and test-driven development.
- Ability to produce clear, concise, and organized documentation.
- Knowledge of Amazon cloud computing services (Aurora MySQL, DynamoDB, EMR, Lambda, Step Functions, and S3).
- Excellent communication skills and the ability to collaborate effectively with team members of varying technical backgrounds.
- Proficiency in conducting detailed code reviews focused on improving code quality and mentoring developers.
- Familiarity with software engineering and project management tools.
- Commitment to following security protocols and best practices in data governance.
- Capability to construct KPIs and use metrics for continuous process improvement.
Minimum qualifications :
- 10+ years of experience in large-volume data processing using Scala and Big Data tools such as Apache Spark, Hadoop, and Snowflake.
- 5+ years of experience developing Scala/Java applications and microservices using Spring Boot.
- 5+ years of experience working with SQL and relational databases.
- 2+ years of experience working within Agile/Scrum environments.
Preferred qualifications :
- Extended experience with Amazon cloud computing infrastructure.
- Background in the educational technology domain.
Posted in : Data Engineering
Functional Area : Big Data / Data Warehousing / ETL
Job Code : 1587491