Posted on: 30/08/2025
What You'll Work On:
- Design, develop & deploy ETL/ELT pipelines (ADF, Databricks, SQL, Python)
- Use GitHub & Azure DevOps to enable CI/CD workflows
- Optimize data quality & transformations using Spark and Big Data platforms
- Create reusable, scalable data solutions across Lakehouse architecture
- Document dataflows & processes, act as L3 support for critical issues
- Collaborate in Agile teams and align deliverables across stakeholders
- Contribute to Data Engineering OKRs and knowledge sharing across global teams
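To illustrate the kind of data-quality work the role involves, here is a minimal sketch of a row-level validation step a pipeline like this might run before loading. In practice this would be a Spark DataFrame transformation on Databricks; the function and field names here are hypothetical, chosen only for illustration.

```python
# Minimal data-quality filter sketch. A production pipeline would express
# this as a Spark DataFrame filter; plain Python is used here to keep the
# example self-contained. All names are hypothetical.

def is_valid_row(row: dict) -> bool:
    """Reject rows missing required fields or with a non-positive amount."""
    required = {"id", "amount", "event_date"}
    if not required.issubset(row):
        return False
    return isinstance(row["amount"], (int, float)) and row["amount"] > 0

def clean(rows):
    """Split rows into (valid, rejected) so rejects can be audited."""
    valid = [r for r in rows if is_valid_row(r)]
    rejected = [r for r in rows if not is_valid_row(r)]
    return valid, rejected
```

Splitting the input into valid and rejected sets, rather than silently dropping bad rows, is what makes the rejects auditable downstream.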
You Bring:
- Solid knowledge of Data Warehousing, Lakehouse, Cloud, Semantic Layer, and Data Visualization
- Experience working in Agile environments
- Strong logical thinking, communication & collaboration skills
- Already serving notice period or available within 30 days
Posted in: DevOps / SRE
Functional Area: DevOps / Cloud
Job Code: 1537783