Posted on: 19/08/2025
Role: Data Engineer
Experience: 6+ years
Location: Pune/Hyderabad
Work Mode: WFO
Job Profile:
- ETL orchestration and workflows: Airflow
- Programming: Python/Kotlin/Scala
- Streaming: Kafka/Flink/Spark
- Databases: data modelling, performance tuning, querying (SQL & Presto)
- Data lake/warehouse: Snowflake
Roles and Responsibilities:
- Own critical data systems that support multiple products/teams
- Develop, implement, and enforce best practices for data infrastructure and automation
- Design, develop, and implement large-scale, high-volume, high-performance data models and pipelines for the data lake and data warehouse
- Improve the reliability and scalability of our ingestion, data processing, ETL, and reporting tools and data-ecosystem services
- Manage a portfolio of data pipelines that deliver high-quality, trustworthy data
Other Specifications:
- 5+ years of experience in data platform/data engineering or a similar role
- Proficiency in programming languages such as Python/Kotlin/Scala
- 5+ years of experience in ETL orchestration and workflow management tools like Airflow
- Expertise in database fundamentals, SQL, data reliability practices, and distributed computing
- 5+ years of experience with distributed data ecosystems (Spark, Presto) and streaming technologies such as Kafka/Flink/Spark Streaming
- Excellent communication skills, experience working with technical and non-technical teams, and knowledge of reporting tools
- Comfortable working in a fast-paced environment; a self-starter and self-organizing
Posted By
Lakshmi
Lead Talent Acquisition Specialist at Siyaton Software Solutions Pvt Ltd
Last Active: NA (the recruiter posted this job through a third-party tool)
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1531860