hirist

Senior Python Developer - Apache Flink

SRM360 CONSULTING
Multiple Locations
8 - 12 Years

Posted on: 02/10/2025

Job Description
Key Responsibilities :


- Design, develop, and maintain large-scale, real-time data pipelines using Apache Flink, Apache Kafka, and Apache Spark.

- Implement robust, fault-tolerant, and scalable distributed systems to handle high-volume streaming and batch data workloads.

- Write clean, efficient, and production-ready Python code, including performance debugging and optimization.

- Collaborate with cross-functional teams including Data Scientists, Product Engineers, and DevOps to deliver end-to-end data solutions.

- Architect and implement event-driven systems and streaming data frameworks to support analytics, reporting, and real-time decision-making.

- Perform system monitoring, troubleshooting, and fine-tuning to ensure low-latency, high-throughput performance.

- Evaluate and integrate new open-source tools, frameworks, and best practices to improve system efficiency and scalability.

- Document system designs, processes, and implementation details to support long-term maintainability.

- Provide mentorship and technical guidance to junior engineers, ensuring adherence to coding standards and best practices.


Must-Have Skills :


- Python : Advanced programming and debugging skills.

- Apache Flink : In-depth, hands-on implementation experience (Mandatory).

- Apache Kafka : Strong understanding and proven experience in distributed messaging systems (Mandatory).

- Apache Spark : Good working knowledge and experience in developing data pipelines.

- Distributed Systems : Hands-on experience building and managing scalable, multi-node distributed systems using open-source frameworks.


Nice-to-Have Skills :


- Apache Ignite : Prior exposure or hands-on experience will be considered a strong plus.

- Knowledge of Kubernetes/Docker for deploying distributed applications.

- Familiarity with cloud environments (AWS, GCP, or Azure) for building and scaling big data systems.


You Should Be :


- Proficient in building robust, real-time data pipelines and event-driven architectures.

- Comfortable working with high-performance, scalable, and low-latency systems.

- Capable of troubleshooting, profiling, and optimizing distributed applications effectively.

- A strong communicator, able to explain complex technical concepts to both technical and non-technical stakeholders.

- Self-motivated and capable of working independently in a remote setup while contributing to a collaborative team culture.


Qualifications :


- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical discipline.

- 8-12 years of experience in data engineering, with at least 2-3 years in building real-time streaming systems.


- Proven experience with distributed systems in production environments.

