Posted on: 22/09/2025
Key Responsibilities:
- Design and architect scalable, secure, and high-performing data solutions using Snowflake, DBT, and Airflow (a minimal sketch of such a pipeline follows this list).
- Translate business requirements into technical data models and solutions in collaboration with stakeholders, data engineers, and analysts.
- Define and enforce best practices for data modeling, ELT pipeline development, and metadata management.
- Provide architectural guidance and mentorship to data engineering teams.
- Lead technical reviews, architecture governance, and data quality assessments.
- Ensure adherence to data privacy, security, and compliance regulations.
- Monitor, troubleshoot, and optimize the performance of Snowflake workloads, Airflow DAGs, and DBT models.
- Evaluate and recommend new tools, frameworks, and architecture strategies.
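For context on this stack, here is a minimal sketch of the kind of orchestration the role involves: an Airflow DAG that builds and tests a dbt project against Snowflake. The DAG id, schedule, and project path are hypothetical placeholders, and the Snowflake credentials are assumed to be configured in the dbt project's profiles.yml.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical project path -- adjust to wherever the dbt project lives.
# Snowflake connection details are assumed to be set in profiles.yml,
# as is standard for dbt.
DBT_PROJECT_DIR = "/opt/dbt/analytics"

with DAG(
    dag_id="dbt_snowflake_daily",      # hypothetical name
    start_date=datetime(2025, 9, 1),
    schedule="@daily",                 # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    # Build the dbt models in dependency order against Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR}",
    )

    # Run dbt tests afterwards so data-quality failures halt the
    # pipeline before downstream consumers see bad data.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR}",
    )

    dbt_run >> dbt_test
```

Ordering the test task after the run task is the design choice that lets data-quality checks act as a gate on the pipeline rather than a side report.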
Required Skills & Qualifications:
Proven hands-on experience with:
1. Snowflake: Data warehouse architecture, query optimization, and role-based access control (see the Snowflake sketch after this list)
2. DBT: Model creation, testing, documentation, and version control
3. Apache Airflow: DAG development, orchestration, monitoring, and scheduling
- Strong command of SQL, data modeling (Star/Snowflake schemas), and ETL/ELT processes
- Proficiency in Python and scripting for automation and data pipeline development
- Experience with at least one major cloud platform (AWS, Azure, or GCP)
- Familiarity with CI/CD practices for data deployments
- Excellent communication, leadership, and analytical problem-solving skills
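As an illustration of the Snowflake skills above, here is a minimal sketch using the snowflake-connector-python package to surface the slowest queries of the last day from ACCOUNT_USAGE.QUERY_HISTORY, a common starting point for query optimization. The account, user, and warehouse names are hypothetical placeholders.

```python
import snowflake.connector

# Hypothetical account, user, and warehouse -- substitute your own.
# Reading ACCOUNT_USAGE views requires a suitably privileged role.
conn = snowflake.connector.connect(
    account="xy12345",
    user="DATA_ARCHITECT",
    password="***",
    warehouse="ANALYTICS_WH",
)

try:
    cur = conn.cursor()
    # TOTAL_ELAPSED_TIME is reported in milliseconds; rank yesterday's
    # queries by runtime to find optimization candidates.
    cur.execute(
        """
        SELECT query_text,
               warehouse_name,
               total_elapsed_time / 1000 AS elapsed_s
        FROM snowflake.account_usage.query_history
        WHERE start_time > DATEADD('day', -1, CURRENT_TIMESTAMP())
        ORDER BY total_elapsed_time DESC
        LIMIT 10
        """
    )
    for query_text, warehouse, elapsed_s in cur.fetchall():
        print(f"{elapsed_s:8.1f}s  {warehouse}  {query_text[:80]}")
finally:
    conn.close()
```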
Functional Area: Data Engineering
Job Code: 1550084