Posted on: 17/12/2025
Description:
We are seeking a Senior Data Engineer.
Job Functions & Responsibilities:
- Design and implement data pipelines across AWS, Azure, and Google Cloud.
- Develop SAP BTP integrations with cloud and on-premises systems.
- Ensure seamless data movement and storage between cloud platforms.
- Develop and optimize ETL workflows using Pentaho and Microsoft Azure Data Factory (ADF), or equivalent ETL tools.
- Design scalable and efficient data transformation, movement, and ingestion processes.
- Monitor and troubleshoot ETL jobs to ensure high availability and performance.
- Develop and integrate RESTful APIs to support data exchange between SAP and other platforms.
- Work with API gateways and authentication methods such as OAuth, JWT, and API keys.
- Implement API-based data extractions and real-time event-driven architectures.
- Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration.
- Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency.
- Support data transformation logic and business rules for ERP reporting needs.
- Work with Ataccama and Collibra to define and enforce data quality and governance policies.
- Implement data lineage, metadata management, and compliance tracking across systems.
- Ensure compliance with enterprise data security and governance standards.
- Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation.
- Deploy and manage data pipelines on AWS, Azure, and Google Cloud.
- Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
- Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements.
- Document ETL workflows, API specifications, data models, and governance policies.
- Provide technical support and troubleshooting for data pipelines and integrations.
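As an illustration of the data validation and reconciliation responsibility above, here is a minimal sketch, in Python, of comparing a source and target dataset after an ETL load. All field names (`id`, `amt`) are hypothetical; a real pipeline would compare query results from the actual source and target systems:

```python
from typing import Any


def reconcile(source: list[dict[str, Any]], target: list[dict[str, Any]],
              key: str, measure: str) -> dict[str, Any]:
    """Compare row counts and a summed measure between source and target,
    and report key values present on only one side."""
    src_keys = {row[key] for row in source}
    tgt_keys = {row[key] for row in target}
    return {
        "row_count_match": len(source) == len(target),
        "sum_match": (sum(r[measure] for r in source)
                      == sum(r[measure] for r in target)),
        "missing_in_target": sorted(src_keys - tgt_keys),
        "missing_in_source": sorted(tgt_keys - src_keys),
    }
```

In practice the same checks (counts, control totals, key differences) are typically pushed down into SQL against both systems; the structure of the report is the point here.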
Required Skills & Experience:
- 7+ years of experience in Data Engineering, ETL, and SQL development.
- Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
- Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
- Proficiency in SQL (stored procedures, query optimization, performance tuning).
- Experience working with Azure DevOps, GitHub, and CI/CD for data pipelines.
- Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
- Experience working with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
- Strong problem-solving skills and ability to work independently in a fast-paced environment.
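To illustrate the OAuth skills listed above, the sketch below builds the parts of an OAuth 2.0 client-credentials token request (RFC 6749, section 4.4). The token URL, client ID, and scope are placeholders; the caller would POST these pieces with any HTTP client:

```python
import base64


def client_credentials_request(token_url: str, client_id: str,
                               client_secret: str, scope: str) -> dict:
    """Assemble URL, headers, and form body for an OAuth 2.0
    client-credentials token request (credentials go in a Basic
    Authorization header; grant parameters go in the form body)."""
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": token_url,
        "headers": {
            "Authorization": f"Basic {basic}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "data": {"grant_type": "client_credentials", "scope": scope},
    }
```

The returned access token would then be sent as a `Bearer` header on subsequent API-based data extractions.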
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1591807