hirist

Job Description :

- Design, build, and maintain ETL/ELT data pipelines for finance data migration from SAP R/3 to S/4HANA

- Work extensively with PySpark and Hadoop-based environments for large-scale data processing

- Orchestrate and monitor workflows using Apache Airflow

- Collaborate with data analysts to implement transformation logic and data mappings

- Coordinate with BI teams to ensure downstream data readiness

- Integrate SAP and non-SAP data sources into AWS (S3, Postgres)

- Support data validation, functional testing, and UAT cycles

- Participate in daily stand-ups and global (India & US) planning calls

- Troubleshoot pipeline issues and optimize performance

- Ensure data quality, governance, and consistency
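As an illustration of the data validation and quality responsibilities above, here is a minimal, stdlib-only Python sketch of the kind of row-level quality gate a migration pipeline might run before loading finance records. All field names and currency codes here are hypothetical examples, not taken from the posting; a real SAP-to-S/4HANA pipeline would validate against actual mapping specs.

```python
# Illustrative sketch only: a pre-load data-quality check such as a
# migration pipeline might apply. Field names (doc_id, amount, currency)
# and the currency whitelist are hypothetical assumptions.

REQUIRED_FIELDS = ("doc_id", "amount", "currency")
VALID_CURRENCIES = {"USD", "EUR", "INR"}

def validate_record(record: dict) -> list:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    # Presence check for required fields
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    # Numeric check on the amount field
    amount = record.get("amount")
    if amount is not None:
        try:
            float(amount)
        except (TypeError, ValueError):
            issues.append("amount is not numeric")
    # Reference-data check on currency
    currency = record.get("currency")
    if currency and currency not in VALID_CURRENCIES:
        issues.append(f"unknown currency {currency!r}")
    return issues

def validate_batch(records: list) -> dict:
    """Summarize issues across a batch, as a pre-load quality gate might."""
    details = {}
    for i, record in enumerate(records):
        issues = validate_record(record)
        if issues:
            details[i] = issues
    return {"total": len(records), "failed": len(details), "details": details}
```

In practice the same checks would typically be expressed as PySpark DataFrame filters or SQL assertions and wired into an Airflow task that fails the run when the rejection rate exceeds a threshold.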

Required Skills & Qualifications :


- 5+ years of experience in Data Engineering

- Strong hands-on experience with PySpark, Hadoop

- 2+ years of experience with Apache Airflow

- Strong knowledge of ETL/ELT best practices

- 2+ years of AWS experience (S3, Postgres)

- Proficiency in SQL and Python

- Strong communication and cross-functional collaboration skills

- Good to Have : SAP Finance data experience (GL, AP/AR, CO, Fixed Assets)

Education :


- UG : Any Graduate

Key Skills :
- Data Engineering, PySpark, Hadoop, Airflow, AWS, SQL, Python, ETL, Data Warehousing, SAP Finance
