Posted on: 03/02/2026
Description :
Role Overview :
We are seeking a highly skilled Senior Data / Platform Engineer with strong expertise in Python, Kafka, and Snowflake to design, build, and optimize large-scale data ingestion and streaming pipelines.
This role requires hands-on engineering proficiency, platform collaboration, and the ability to work closely with onshore teams to deliver high-quality, scalable data solutions.
Primary Skills (Must-Have) :
Python :
- Strong hands-on experience developing and maintaining data ingestion pipelines.
- Ability to write production-grade, modular, and testable Python code.
- Familiarity with building reusable libraries, error handling, and logging frameworks.
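To illustrate the kind of production-grade, testable Python described above, here is a minimal sketch of a modular ingestion helper with error handling and structured logging (function and field names are hypothetical, standard library only):

```python
import json
import logging

logger = logging.getLogger("ingestion")


def parse_record(raw: str) -> dict:
    """Parse one raw JSON record; raise ValueError on malformed input."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"malformed record: {raw!r}") from exc
    if "id" not in record:  # hypothetical required field
        raise ValueError(f"record missing 'id': {record!r}")
    return record


def ingest_batch(raw_records: list[str]) -> tuple[list[dict], int]:
    """Parse a batch, logging and counting failures instead of crashing."""
    parsed, failures = [], 0
    for raw in raw_records:
        try:
            parsed.append(parse_record(raw))
        except ValueError:
            logger.exception("skipping bad record")
            failures += 1
    return parsed, failures
```

Keeping parsing separate from batch iteration is what makes the code unit-testable: `parse_record` can be exercised directly, while the batch loop's skip-and-log behavior is tested with mixed good and bad input.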
Kafka :
- Experience working with Kafka producers, consumers, topics, and partitions.
- Proven background designing and supporting Kafka-to-Snowflake streaming data flows.
- Understanding of event schemas, offset management, retries, and failure handling.
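As a toy illustration of the offset-management and retry concerns above, the following sketch simulates at-least-once consumption in plain Python: the "offset" is committed only after a message is processed successfully, and messages that exhaust their retries are dead-lettered. This is a self-contained simulation, not a real Kafka client; all names are hypothetical.

```python
def consume_with_retries(messages, process, max_retries=3):
    """At-least-once consumption sketch: commit the offset only after a
    message is processed successfully; retry failures up to max_retries
    times, then dead-letter the message and move on."""
    committed_offset = -1   # last successfully processed offset
    dead_lettered = []      # messages that exhausted their retries
    for offset, msg in enumerate(messages):
        for attempt in range(max_retries):
            try:
                process(msg)
                committed_offset = offset   # "commit" after success
                break
            except Exception:
                if attempt == max_retries - 1:
                    dead_lettered.append(msg)
    return committed_offset, dead_lettered
```

With a real consumer the same principle applies: disable auto-commit and commit offsets only after processing, so a crash replays uncommitted messages instead of losing them.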
Secondary / Supporting Skills :
Snowflake :
- Experience with Snowflake tasks, streams, pipelines, connectors, and ingestion mechanisms.
- Understanding of data loading patterns: batch, micro-batch, and real-time streaming.
Workflow Orchestration :
Hands-on experience with Apache Airflow for :
- DAG orchestration
- Dependency management
- Retries & alerting configurations
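The retry-and-alerting behavior that Airflow configures through its `retries`, `retry_delay`, and `on_failure_callback` settings can be sketched in plain Python (a standalone illustration, not Airflow code; names are hypothetical):

```python
import time


def run_with_retries(task, retries=2, retry_delay=0.0, on_failure=None):
    """Mimic Airflow-style task retries: re-run a failed task up to
    `retries` extra times, waiting `retry_delay` seconds between
    attempts, and fire the alert callback only once all attempts
    are exhausted."""
    last_exc = None
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception as exc:
            last_exc = exc
            if attempt < retries:
                time.sleep(retry_delay)
    if on_failure is not None:
        on_failure(last_exc)   # alerting hook, e.g. page or Slack
    raise last_exc
```

The key design point mirrored here is that alerts fire on final failure only, so transient errors that succeed on retry do not page anyone.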
Platform Collaboration :
- Comfortable working with Java-based microservices or distributed platforms.
- Ability to align ingestion pipelines with upstream application changes and release cycles.
Key Responsibilities :
- Design, build, and maintain Python-based ingestion pipelines targeting Snowflake.
- Implement, optimize, and support Kafka-to-Snowflake streaming and ingestion workflows.
- Develop, schedule, and manage Airflow DAGs to orchestrate data processes.
- Monitor pipeline health, troubleshoot failures, and resolve issues related to latency, reliability, and data quality.
- Collaborate closely with the onshore engineering team for architecture alignment and faster delivery.
- Enhance observability, performance, and fault tolerance of data pipelines.
- Follow data engineering best practices, including structured logging, alerts & monitoring, robust error handling, and automated testing.
Preferred Qualifications :
- 5 to 8+ years of experience as a Data Engineer or Platform Engineer.
- Strong understanding of distributed systems and event-driven architectures.
- Experience in cloud-based environments (AWS/Azure/GCP) is a plus.
- Familiarity with CI/CD pipelines for data engineering workflows.
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1609180