Description:
Responsibilities:
- Lead technical initiatives and guide the team to develop innovative software solutions that address complex challenges.
- Build scalable, efficient, and high-performance pipelines and workflows for processing large volumes of batch and real-time data.
- Maintain and enhance existing software systems to ensure performance and reliability.
- Recommend and implement technology upgrades to drive continuous improvement.
- Support real-time streams, ETL pipelines, data warehouses, and reporting services.
- Design and develop data frameworks, applications, and microservices that seamlessly integrate with other services.
- Utilize Big Data tools such as Kafka, AWS S3 Data Lake, EMR, and Spark to ingest, store, transform, and query data (see the sketch after this list).
- Adhere to coding best practices, including unit testing, design/code reviews, and comprehensive documentation.
- Conduct thorough code reviews to maintain quality, mentor junior team members, and promote continuous learning.
- Conduct performance analysis and capacity planning for each release.
- Work effectively as part of an Agile team, contributing to process improvements and innovative solutions.
- Implement and promote security protocols and data governance standards across development projects.
- Proactively introduce new approaches to overcome software challenges throughout the product lifecycle.
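As an illustration only (the posting references no codebase, so the job name, S3 paths, and columns below are all hypothetical), a minimal Spark batch job in Scala of the read-transform-write shape described above might look like this:

    import org.apache.spark.sql.{SparkSession, functions => F}

    object DailyEventCountsJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("daily-event-counts")
          .getOrCreate()

        // Hypothetical input path; a production job would take this as an argument.
        val raw = spark.read.json("s3://example-data-lake/raw/events/")

        // Count events per user per day.
        val daily = raw
          .withColumn("event_date", F.to_date(F.col("event_ts")))
          .groupBy("user_id", "event_date")
          .agg(F.count("*").as("event_count"))

        // Hypothetical output path, partitioned for downstream queries.
        daily.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-data-lake/curated/daily_event_counts/")

        spark.stop()
      }
    }

A real pipeline would parameterize the paths, validate the input schema, and handle late or malformed records; this only sketches the overall shape of the work.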
Requirements:
- Strong software design skills with a deep understanding of design patterns and performance optimization.
- Expertise in writing high-quality, well-structured Scala code with an emphasis on functional programming and test-driven development (a brief sketch follows this list).
- Ability to produce clear, concise, and organized documentation.
- Knowledge of AWS services, including Aurora MySQL, DynamoDB, EMR, Lambda, Step Functions, and S3.
- Excellent communication skills and the ability to collaborate effectively with team members of varying technical backgrounds.
- Proficiency in conducting detailed code reviews focused on improving code quality and mentoring developers.
- Familiarity with software engineering and project management tools.
- Commitment to following security protocols and best practices in data governance.
- Ability to define KPIs and use metrics to drive continuous process improvement.
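Again purely as an illustration (the function and test names below are invented, not taken from the posting), the functional, test-driven style the requirements call for favors small pure functions paired with focused unit tests, for example with ScalaTest:

    import org.scalatest.funsuite.AnyFunSuite

    // A pure function: no shared state, no side effects, trivially testable.
    object Metrics {
      // Mean of a sequence; None for empty input rather than throwing.
      def mean(xs: Seq[Double]): Option[Double] =
        if (xs.isEmpty) None else Some(xs.sum / xs.size)
    }

    // The matching test suite, written first under TDD.
    class MetricsSpec extends AnyFunSuite {
      test("mean of a non-empty sequence") {
        assert(Metrics.mean(Seq(1.0, 2.0, 3.0)).contains(2.0))
      }
      test("mean of an empty sequence is None") {
        assert(Metrics.mean(Nil).isEmpty)
      }
    }

Returning Option rather than throwing keeps the function total, which is what makes the empty-input case a one-line test.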
Minimum qualifications:
- 15+ years of experience designing and developing enterprise-level software solutions.
- 5+ years of experience with large-volume data processing using Big Data tools such as Apache Spark (with Scala), Hadoop, and Snowflake.
- 5+ years of experience developing Scala/Java applications and microservices using Spring Boot.
- 5+ years of experience working with SQL and relational databases.
- 2+ years of experience working within Agile/Scrum environments.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1560276