Posted on: 09/12/2025
Role : Senior Data Engineer (Kafka & AWS)
Exp : 6-9 Years
Location : Pune
- Work mode : In-office (no WFH or hybrid)
- Mandatory skills : AWS, Kafka, Python
Responsibilities :
- Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.
- Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
- Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry (see the sketch after this list).
- Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
- Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
- Implement robust monitoring, testing, and observability practices to ensure the reliability and performance of data platforms.
- Uphold data security, governance, and compliance standards across all data operations.
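For a flavour of the day-to-day Kafka work described above, here is a minimal, illustrative sketch (not part of the role description) using the confluent-kafka Python client; the localhost:9092 broker address and the "orders" topic are placeholder assumptions, not details from the posting.

```python
# Minimal Kafka producer/consumer sketch with the confluent-kafka client.
# Broker address and topic name are placeholder assumptions.
import json

from confluent_kafka import Consumer, Producer

BROKER = "localhost:9092"  # assumption: replace with the MSK/Confluent bootstrap servers
TOPIC = "orders"           # hypothetical topic name


def produce_event(event: dict) -> None:
    """Serialize a record as JSON and publish it to the topic."""
    producer = Producer({"bootstrap.servers": BROKER})
    producer.produce(TOPIC, value=json.dumps(event).encode("utf-8"))
    producer.flush()  # block until the broker acknowledges delivery


def consume_events(max_messages: int = 10) -> None:
    """Poll the topic and print records; a real pipeline would transform and load them."""
    consumer = Consumer({
        "bootstrap.servers": BROKER,
        "group.id": "orders-pipeline",    # consumer group enables scalable parallel consumption
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    seen = 0
    try:
        while seen < max_messages:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                print(f"Consumer error: {msg.error()}")
                continue
            print("Consumed:", json.loads(msg.value().decode("utf-8")))
            seen += 1
    finally:
        consumer.close()  # commit offsets and leave the group cleanly


if __name__ == "__main__":
    produce_event({"order_id": 1, "amount": 99.5})
    consume_events()
```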
Requirements :
- Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.), illustrated in the sketch after this list.
- Proficient in Python, SQL, and Java, with Java strongly preferred.
- Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
- Excellent problem-solving, communication, and collaboration skills.
- Flexibility to write production-quality code in both Python and Java as required.
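As a companion sketch for the AWS side of the stack, the following is a minimal AWS Lambda handler, assuming boto3 and a hypothetical example-data-lake bucket, that lands incoming pipeline records in S3 as date-partitioned JSON objects for downstream Glue/Athena use; it is a sketch under those assumptions, not this team's actual implementation.

```python
# Minimal AWS Lambda handler sketch: write the incoming event payload to S3.
# The bucket name and key layout are placeholder assumptions.
import json
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"  # assumption: replace with the real data-lake bucket


def handler(event, context):
    """Persist the event as a date-partitioned JSON object for Glue/Athena to pick up."""
    now = datetime.now(timezone.utc)
    key = f"raw/orders/dt={now:%Y-%m-%d}/{now:%H%M%S%f}.json"  # Hive-style dt= partition
    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(event).encode("utf-8"),
        ContentType="application/json",
    )
    return {"statusCode": 200, "body": json.dumps({"written_to": key})}
```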
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1586990