Posted on: 18/07/2025
Job Title : Confluent Kafka Engineer
Location : Visakhapatnam / Remote
Company : Tetra Connects Pvt. Ltd.
Experience Required : 4–5 Years
Certification Required : Confluent Certified Developer / Administrator
Job Summary :
Tetra Connects Pvt. Ltd. is seeking a highly skilled and certified Confluent Kafka Engineer with 4–5 years of hands-on experience in building, managing, and scaling distributed streaming platforms. The ideal candidate will play a pivotal role in designing and developing real-time data streaming pipelines using Apache Kafka and the Confluent Platform, ensuring optimal performance, scalability, and data reliability across systems.
Key Responsibilities :
- Design, deploy, and manage scalable Kafka clusters (on-premise or via Confluent Cloud).
- Develop, implement, and maintain reliable data streaming pipelines using Kafka producers, consumers, connectors, and related components.
- Optimize Kafka topics, partitions, and configurations for high throughput and low latency.
- Ensure high availability, fault tolerance, and data replication across the Kafka infrastructure.
- Integrate Kafka with enterprise systems and collaborate with cross-functional teams including data engineers, architects, and application developers.
- Monitor system health and performance; perform tuning, debugging, and root cause analysis as needed.
- Enforce data security, compliance, and governance policies including RBAC and encryption.
- Maintain technical documentation covering architecture, configurations, and operational best practices.
Required Skills & Qualifications :
- 4–5 years of relevant experience working with Apache Kafka and the Confluent Platform.
- Confluent Certification (Developer or Administrator) – mandatory.
- Strong experience with Kafka Streams, Kafka Connect, Schema Registry, and ksqlDB.
- Solid understanding of distributed systems, event-driven architecture, and real-time data processing.
- Proficiency in Java, Python, or Scala for Kafka application development.
- Hands-on experience with containerization (Docker); basic familiarity with Kubernetes is a plus.
- Experience with monitoring tools such as Prometheus, Grafana, or Confluent Control Center.
- Good understanding of CI/CD pipelines and DevOps practices.
Preferred Qualifications :
- Experience working with Confluent Cloud and its enterprise-grade features.
- Exposure to alternative messaging systems like RabbitMQ, ActiveMQ, or AWS Kinesis.
- Working knowledge of cloud environments such as AWS, Azure, or Google Cloud Platform (GCP).
- Familiarity with Agile/Scrum development methodologies.
Employment Details :
- Employment Type : Full-Time
- Remuneration : As per industry standards
- Notice Period : Immediate joiners preferred (up to 30 days)
Posted in : DevOps / SRE
Functional Area : Data Engineering
Job Code : 1514497