Posted on: 04/12/2025
Description:
We are looking for an experienced Senior Data Engineer to design and implement scalable data architectures and AI-ready data products. The ideal candidate has deep expertise in the Databricks Lakehouse Platform, strong skills in AWS cloud services, and exposure to SAP data processing.
Key Responsibilities:
- Architect and build scalable data pipelines, models, and products (Databricks, AWS).
- Manage the end-to-end data lifecycle and develop enterprise data models.
- Integrate data from SAP (ECC, S/4HANA) and non-SAP systems.
- Develop batch, real-time, and streaming data solutions.
- Implement data governance, security, quality, and observability.
- Optimize performance and cost across platforms.
- Collaborate with cross-functional teams to deliver enterprise-ready data solutions.
Required Skills:
- 10+ years of experience in data engineering/data architecture.
- Strong expertise in Databricks (Delta Lake, Medallion Architecture, DLT, PySpark, SQL Warehouse, Unity Catalog).
- Proficiency in Python, SQL, PySpark.
- AWS experience: S3, Lambda, EMR, Redshift, Bedrock.
- Data modeling (ER & Dimensional) and metadata management.
- CI/CD and DevOps awareness.
Preferred Skills:
- SAP S/4HANA data extraction (DataSphere, SLT, BDC).
- ABAP, CDS views, manufacturing domain experience.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1584066