Posted on: 14/07/2025
Must-have Technical Skills:
1. Expertise and hands-on experience with Spark DataFrames and Hadoop ecosystem components
2. Good hands-on experience with any major cloud platform (AWS/Azure/GCP)
3. Good knowledge of PySpark (Spark SQL)
Good-to-have Technical Skills:
1. Good knowledge of shell scripting & Python
2. Good knowledge of SQL
3. Good knowledge of migration projects on Hadoop
4. Good knowledge of a workflow engine such as Oozie or Autosys
5. Good knowledge of Agile development
6. Passionate about exploring new technologies
7. Automation-first approach
8. Knowledge of data ingestion, processing, and orchestration
Roles & Responsibilities:
1. Lead the technical implementation of data warehouse modernization projects for Impetus
2. Design and develop applications on cloud technologies
3. Lead technical discussions with internal & external stakeholders
4. Resolve technical issues for the team
5. Ensure the team completes all tasks and activities as planned
6. Develop code
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1512793