Posted on: 30/12/2025
Description:
Senior Product Development Engineer
Experience: 5+ Years
Role Summary:
The Senior Product Development Engineer is a core technical role at Agivant, an AI-first engineering services firm. You will architect and build the backbone of high-performance distributed systems that enable real-time data processing at scale. The role demands a deep understanding of parallel processing and distributed systems design, with a focus on high-availability (HA) data ingestion from diverse sources such as Kafka, Apache Iceberg, and the major cloud storage providers. You will bridge the gap between complex design specifications and production-ready code, ensuring cross-region replication and robust system monitoring. As a senior engineer, you will champion an Agile, AI-first approach to strategy and structure, driving our clients' growth through technology-led innovation.
Responsibilities:
- Distributed Core Development: Design and implement robust, scalable software solutions for distributed systems using Java, C++, or Golang, focusing on parallel processing architectures.
- Data Ingestion Engineering: Architect high-throughput data ingestion pipelines from cloud storage (S3/Azure/GCS), relational databases (PostgreSQL), cloud data warehouses (Snowflake, BigQuery), and data lakehouses (Apache Iceberg).
- Kafka Ecosystem Management: Lead the configuration and customization of Kafka, Kafka Connect, and Kafka Streams, ensuring high security standards and efficient event-driven communication.
- High-Availability (HA) Design: Implement strategies for HA loading and cross-region replication to ensure zero downtime and data residency compliance.
- Spark Connectivity: Develop and optimize Spark connectors to facilitate seamless data processing between distributed storage and compute layers.
- System Monitoring: Design and integrate comprehensive load monitoring and automated error-reporting systems to maintain optimal cluster health.
- Third-Party Orchestration: Manage and tune critical third-party infrastructure components, including Kafka clusters and associated connectors.
- Agile Execution: Actively participate in the full Agile development lifecycle, ensuring continuous integration and delivery (CI/CD) of high-quality product features.
- Architecture Design: Translate complex design specifications into efficient technical implementations, emphasizing modularity and system performance.
Technical Requirements:
- Distributed Systems: Proven experience building and scaling distributed systems that apply complex parallel processing logic.
- Event-Driven Architecture: Mastery of event-driven systems and streaming technologies, specifically Kafka (including ZooKeeper, Kafka Connect, and security protocols).
- Big Data Stack: Hands-on experience with Apache Spark and the development of custom Spark connectors for high-speed data transfer.
- Cloud & Databases: Proficiency with cloud-native data stores (S3/Azure/GCS) and enterprise data platforms such as Snowflake or Google BigQuery.
- Modern Tooling: Strong familiarity with CI/CD pipelines and modern software development workflows.
- Core Languages: Deep expertise in at least one systems-level language: Java, C++, or Golang.
Preferred Skills:
- Protocols & Concurrency: Experience with gRPC and advanced multi-threading techniques.
- Service Discovery: Familiarity with ZooKeeper, etcd, or Consul for cluster coordination.
- Distributed Algorithms: Theoretical and practical knowledge of distributed consensus algorithms such as Paxos or Raft.
- Containerization: Experience with Docker and Kubernetes for managing containerized microservices.
- Lakehouse Knowledge: Hands-on experience with Apache Iceberg or similar table formats for massive data sets.
- Analytical Mindset: Excellent problem-solving skills with a focus on system optimization and performance tuning.
Posted in: Backend Development
Functional Area: Backend Development
Job Code: 1595574