Posted on: 07/08/2025
Job Summary:
We are seeking a highly skilled Technical Architect with deep expertise in real-time data streaming platforms, especially within the Apache Kafka ecosystem, to lead the architecture and implementation of scalable, secure, and high-throughput event-driven systems. This role involves end-to-end solution design and optimization of streaming data pipelines using technologies like Kafka, Kafka Connect, Spark Streaming, Flink, Beam, and cloud-native tools such as Google Cloud Pub/Sub.
You will work with cross-functional teams, including data engineers, DevOps, product managers, and cloud architects, to define robust and high-performance streaming architectures that power data-intensive applications and analytics.
Key Responsibilities:
Kafka Architecture & Management:
- Design and architect scalable, fault-tolerant, and high-throughput streaming systems using Apache Kafka.
- Build and manage Kafka clusters for high availability, scalability, disaster recovery, and performance.
- Optimize cluster configuration parameters, partitioning strategies, consumer group behavior, and replication factors.
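As an illustrative example of these cluster-level decisions, the minimal Java sketch below uses Kafka's AdminClient to create a topic with an explicit partition count, replication factor, and min.insync.replicas setting; the broker addresses, topic name, and sizing values are placeholders, not recommendations.

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class TopicProvisioner {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // Placeholder bootstrap servers; point these at your own brokers.
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
                      "broker1:9092,broker2:9092,broker3:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // 12 partitions for consumer parallelism, RF=3 for fault tolerance
                // (illustrative values, not sizing guidance).
                NewTopic orders = new NewTopic("orders", 12, (short) 3)
                    .configs(Map.of(
                        "min.insync.replicas", "2",  // survive one replica outage with acks=all
                        "retention.ms", "604800000"  // retain data for 7 days
                    ));
                admin.createTopics(List.of(orders)).all().get(); // block until the broker acks
            }
        }
    }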
Streaming Pipeline Development:
- Architect and integrate Kafka Streams, Kafka Connect, Apache Flink, Beam, and Spark Streaming to enable complex real-time data processing pipelines.
- Design end-to-end pipelines that support ingestion, transformation, enrichment, validation, and delivery of streaming data across systems (see the sketch after this list).
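To make those pipeline stages concrete, here is a minimal Kafka Streams sketch covering ingestion, validation, a simple transformation, and delivery. The topic names (orders.raw, orders.clean) are hypothetical, and the string handling stands in for real enrichment logic such as a KTable join.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class OrderPipeline {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-pipeline"); // group id + state store prefix
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092"); // placeholder broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> raw = builder.stream("orders.raw");   // ingestion

            raw.filter((key, value) -> value != null && !value.isBlank()) // validation: drop empty records
               .mapValues(value -> value.trim().toUpperCase())            // transformation (enrichment stand-in)
               .to("orders.clean");                                       // delivery to downstream topic

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }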
Platform Integration & Ecosystem:
- Implement and manage integrations with Schema Registry, Kafka Connect connectors, GCP Pub/Sub, and third-party systems (a sample connector configuration follows this list).
- Drive seamless data interoperability across operational databases, data lakes, cloud services, and analytics platforms.
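For illustration, a Kafka Connect integration of this kind is typically declared as a JSON configuration and submitted to the Connect REST API (POST /connectors). The sketch below assumes the Confluent JDBC source connector with Avro serialization via Schema Registry; the connector name, database URL, table, and registry address are all placeholders.

    {
      "name": "orders-jdbc-source",
      "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "tasks.max": "1",
        "connection.url": "jdbc:postgresql://db.example.internal:5432/shop",
        "table.whitelist": "orders",
        "mode": "incrementing",
        "incrementing.column.name": "id",
        "topic.prefix": "pg.",
        "value.converter": "io.confluent.connect.avro.AvroConverter",
        "value.converter.schema.registry.url": "http://schema-registry:8081"
      }
    }

The AvroConverter settings are what tie the connector to Schema Registry, so downstream consumers can resolve record schemas automatically.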
Security, Monitoring, and Compliance:
- Enforce security best practices for Kafka, including authentication, authorization (ACLs), TLS encryption, and audit logging (an example client configuration follows this list).
- Implement observability solutions using monitoring tools (e.g., Prometheus, Grafana, Confluent Control Center) to track cluster health, latency, throughput, consumer lag, and overall performance.
- Ensure compliance with enterprise policies and governance standards.
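As one example of what the client side of these controls looks like, the properties sketch below combines TLS in transit with SASL/SCRAM authentication; the hosts, principal, and file paths are placeholders. Authorization is enforced separately on the brokers via ACLs bound to the authenticated principal.

    # client.properties -- illustrative values only
    security.protocol=SASL_SSL
    sasl.mechanism=SCRAM-SHA-512
    sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
        username="svc-pipeline" password="<secret>";
    ssl.truststore.location=/etc/kafka/secrets/client.truststore.jks
    ssl.truststore.password=<secret>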
Collaboration & Leadership:
- Translate business and technical requirements into scalable Kafka-based architectural solutions.
- Collaborate with development and DevOps teams to automate Kafka deployments using infrastructure-as-code (e.g., Terraform, Ansible, Helm); a small Terraform sketch follows this list.
- Serve as a subject matter expert, mentor, and advisor for engineering teams on Kafka and stream processing best practices.
- Stay current with industry trends, contribute to architectural roadmaps, and evaluate emerging technologies in the real-time data space.
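One flavor of that infrastructure-as-code work: a minimal Terraform sketch provisioning a Google Cloud Pub/Sub topic and subscription. The project ID and resource names are placeholders; Kafka clusters themselves are more commonly deployed with Helm charts or Ansible roles.

    # main.tf -- minimal sketch; project and resource names are placeholders
    provider "google" {
      project = "my-gcp-project"
    }

    resource "google_pubsub_topic" "orders_events" {
      name = "orders-events"
    }

    resource "google_pubsub_subscription" "orders_events_sub" {
      name                 = "orders-events-sub"
      topic                = google_pubsub_topic.orders_events.id
      ack_deadline_seconds = 30  # how long Pub/Sub waits for an ack before redelivery
    }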
Required Technical Skills:
- Apache Kafka (core): architecture, broker configuration, topic management, partitioning, replication, consumer groups, and offset management.
- Kafka Connect: development and management of source/sink connectors (custom and Confluent-provided).
- Schema Registry: schema design (Avro/Protobuf), evolution, and compatibility settings (an example schema follows this list).
- Kafka Streams, Apache Flink, Beam, Spark Streaming: hands-on experience in real-time stream processing.
- Cloud platforms: especially GCP (Google Cloud Pub/Sub, Dataflow, BigQuery); exposure to AWS or Azure is a plus.
- Strong understanding of distributed systems, event-driven architecture, message serialization formats (Avro, Protobuf, JSON), and data consistency models.
- Hands-on with monitoring, logging, and alerting tools for real-time systems.
- Proficient in Java or Scala; Python is a bonus.
- Experience with containerization and orchestration tools like Docker, Kubernetes, and Helm.
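As a small worked example of the schema-evolution skills above: under BACKWARD compatibility in Schema Registry, a field can be added to an Avro record only if it carries a default, as with the hypothetical currency field below. Consumers on the new schema can then still read records written with the old one, because the default fills the missing field.

    {
      "type": "record",
      "name": "OrderEvent",
      "namespace": "com.example.events",
      "fields": [
        { "name": "orderId",  "type": "string" },
        { "name": "amount",   "type": "double" },
        { "name": "currency", "type": "string", "default": "USD" }
      ]
    }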
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1526565