Posted on: 11/08/2025
Key Responsibilities:
- Design and implement robust, scalable Snowflake data solutions.
- Lead enterprise-scale Snowflake implementations, including greenfield projects and migrations.
- Develop and manage data pipelines using DBT, Airflow, and ETL tools like Fivetran or Matillion.
- Optimize Snowflake performance through best practices in architecture, micro-partitioning, and resource management.
- Collaborate with cross-functional teams to define data architecture, governance, and integration strategies.
- Participate in client meetings, ensuring effective communication and stakeholder alignment across project phases.
Technical Requirements:
- 7+ years of experience in Data Engineering / BI platforms.
- At least 4-5 years of hands-on experience with Snowflake.
- In-depth understanding of Snowflake architecture (warehouses, RBAC, micro-partitioning, data sharing, etc.).
- Strong command of SQL, including advanced concepts like window functions and dynamic SQL.
Experience with:
- DBT (Data Build Tool)
- Apache Airflow
- ETL tools (e.g., Fivetran, Matillion)
- Cloud services, especially AWS
- Scripting languages such as Python or Scala
- Experience with large-scale data system design, implementation, and performance tuning.
- SnowPro Advanced Certification is a plus.
Soft Skills:
- Excellent communication and interpersonal skills.
- Proven ability to lead end-to-end implementation projects (architecture, design, UAT, go-live).
- Comfortable working with international stakeholders, especially US-based clients.
- Self-driven, reliable, and detail-oriented with strong problem-solving skills.
Functional Area: Data Engineering
Job Code: 1527451