Software Developer - AWS & Kafka - Data Pipeline

ForeFuture Management Consultants
Multiple Locations
8 - 12 Years

Posted on: 04/08/2025

Job Description

Responsibilities:

- Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services.

- Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.

- Demonstrate a strong understanding of the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.

- Design and implement scalable ETL/ELT workflows to process large volumes of data efficiently.

- Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.

- Implement robust monitoring, testing, and observability practices to ensure data platform reliability and performance.

- Uphold data security, governance, and compliance standards across all data operations.

Requirements:

- Minimum of 8 years of experience in data engineering or related roles.

- Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).

- Proficient in coding with Python, SQL, and Java (Java strongly preferred); must be flexible to write code in either Python or Java.

- Experience with infrastructure-as-code tools (e.g., CloudFormation) and CI/CD pipelines.

- Excellent problem-solving skills and strong communication and collaboration abilities.
