Posted on: 20/09/2025
Job Profile:
- Job scheduling using Oozie, Airflow, or any other ETL scheduler
- Analyze, re-architect, and re-platform on-premises data warehouses to data platforms on the AWS cloud using native or third-party services
- Design and build production data pipelines, from ingestion to consumption, within a big data architecture, using Java, Python, or Scala
- Design and deliver data analytics solutions using AWS cloud-native services
- Requirements analysis and solution architecture design; data modelling, ETL, data integration, and data migration design
- Waterfall, Agile, Scrum, and similar project delivery methodologies
- Internal as well as external stakeholder management
- MDM / Data Quality / Data Governance technologies such as Collibra, Ataccama, Alation, or Reltio will be an added advantage
- AWS Solution Architect or AWS Data Specialty certification will be an added advantage
- Working experience with Snowflake, Databricks, and the open-source stack (Hadoop big data ecosystem, PySpark, Scala, Python, Hive, etc.)
You will contribute to the role by:
- Breaking through barriers: helping create a better customer experience by taking a data-first approach and delivering insights
- Adapting to anything: reacting and responding with agility to new business priorities, market conditions, and customer opportunities through rapidly deployable solutions
- Innovating anywhere: solving problems with powerful solutions that enable interoperability across multiple lines of business
Candidate Profile:
- Ready to work at Noida/Hyderabad in a permanent role
- Can join within 15 days
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1549106