hirist

Job Description

Role : Senior Kafka Developer

Experience : 8+ years


We are seeking an experienced Senior Kafka Developer to join our team. The ideal candidate will have strong hands-on experience with Apache Kafka and a solid grasp of data integration and processing requirements. The role involves collaborating with business stakeholders to deliver high-quality, scalable, and performant data solutions. The selected candidate will be responsible for the full development cycle: requirements gathering, solution design, and implementation, through supporting testing, managing go-live, and providing post-deployment support. Additionally, you will mentor offshore development teams to ensure smooth delivery.

Key Responsibilities :


- Collaborate with Stakeholders : Engage with business stakeholders to gather and understand data integration and processing requirements, translating them into technical solutions.

- Solution Design : Design and architect efficient, scalable, and reliable data solutions using Apache Kafka, ensuring smooth integration with existing systems.

- Development and Implementation : Develop and implement Kafka-based data processing solutions, ensuring high performance and reliability.

- Testing and Deployment : Assist with testing activities, troubleshoot issues, and manage the deployment processes for Kafka-based solutions.

- Post-Go-Live Support : Provide support during the hypercare phase, troubleshooting and resolving issues to ensure smooth functionality.

- Mentorship : Guide and mentor offshore development teams, ensuring adherence to best practices, project timelines, and quality standards.

Technical Skills :

- Apache Kafka : In-depth experience in implementing and managing Kafka streams, ensuring efficient and reliable data processing.

- Programming : Proficiency in Python and Java for developing data processing solutions, with experience working in large-scale environments.

- Data Storage : Expertise in MS SQL and experience working with Hive, Sqoop, and other data storage solutions.

- Cloud Technologies : Experience with Azure Databricks and Spark for handling large-scale data processing.

- Real-Time Data Processing : Familiarity with Kinesis, Apache NiFi, and other tools for real-time data streaming and processing.

- Scripting : Solid experience in Unix Shell Scripting for automating workflows and managing data tasks.

- Experience with real-time data streaming and data integration platforms.

- Expertise in data architecture principles and performance optimization.
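To illustrate the kind of stateful stream processing this role centres on, the sketch below shows a per-key aggregation in plain Python, analogous to what a Kafka Streams topology performs per partition. The function name `count_by_key` and the click-stream sample data are hypothetical, chosen only for illustration; a real implementation would consume from a Kafka topic rather than an in-memory list.

```python
from collections import defaultdict

def count_by_key(events):
    """Aggregate event counts per key: the kind of stateful operation
    a Kafka Streams topology maintains per partition. `events` is a
    hypothetical in-memory stand-in for records read from a topic."""
    counts = defaultdict(int)
    for key, _value in events:
        counts[key] += 1
    return dict(counts)

# Hypothetical click-stream sample: (key, value) pairs
events = [("user-1", "click"), ("user-2", "click"), ("user-1", "view")]
print(count_by_key(events))  # {'user-1': 2, 'user-2': 1}
```

In a production Kafka deployment the same logic would be keyed by partition and backed by a changelog topic for fault tolerance, but the aggregation itself is exactly this shape.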

Soft Skills :

- Excellent analytical and problem-solving skills.

- Strong communication skills with the ability to interact effectively with both technical and non-technical stakeholders.

- Ability to work independently and manage multiple tasks in a fast-paced environment.

