Posted on: 23/11/2025
About the Role:
We are looking for a Senior Data Architect to join our Data, AI & GenAI practice at Rapyder.
The role demands deep technical expertise in data warehousing and ETL, hands-on cloud migration experience, and the ability to translate business needs into scalable data architectures.
You will collaborate closely with sales, customers, and delivery teams to design enterprise-grade data solutions that deliver measurable business outcomes.
Roles & Responsibilities:
Must Have Technical Skills:
- Expertise in ETL tools such as Azure Data Factory (ADF), Informatica, AWS Glue, or similar.
- Hands-on experience in designing & implementing Data Warehousing solutions.
- Proven track record of building at least two Enterprise Data Warehouses from scratch.
- Strong experience in data migration from on-premises systems to cloud platforms.
- Ability to design ETL/ELT processes, create data flows, and resolve technical challenges.
- Translate business vision into ETL solution architecture with clarity and precision.
Must Have Communication & Collaboration:
- Strong oral and written communication skills.
- Ability to present complex technical topics to large audiences.
- Experience working directly with customers and sales teams to define solutions and project designs.
Good to Have Technical Skills:
- Knowledge of Azure SQL Database, Azure SQL Data Warehouse, Snowflake, Synapse, or Redshift.
- Familiarity with scripting languages (Shell, PowerShell, Perl, Python, Scala).
- Understanding of data governance best practices and master data management.
- Exposure to maturity assessments for existing DW/ETL solutions.
- Strong knowledge of relational and dimensional modeling.
- Competency in data security techniques, including anonymization.
- Experience in defining cloud migration strategies.
Good to Have Research & Development:
- Proactive in research and proofs of concept (POCs), evaluating alternative solutions, and preparing technical reports.
Requirements:
- Experience: Minimum 10 years in IT, with at least 5 years in Data Warehouse & ETL technologies.
- Minimum 2-3 years of hands-on work on cloud platforms (AWS/Azure) using services such as Azure Data Factory v2, Synapse, Redshift, Databricks, Snowflake, or Talend.
- Expertise in at least one major cloud provider (AWS or Azure).
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1578809