Posted on: 29/10/2025
Description :
Experience : 8+ Years
Location Options :
- Bay Area (Local Candidates Preferred) - Travel on an as-needed basis
- Hyderabad (Local Candidates Only) - 5 days onsite
Employment Type : Full-time / Contract
Role Summary :
As a Snowflake Subject Matter Expert (SME), you will be responsible for architecting, developing, and optimizing data warehouse and lakehouse solutions using the Snowflake platform.
Your expertise will enable the data team to build a modern, scalable, and high-performance data platform that supports business analytics, data science, and reporting needs.
You will work closely with data architects, engineers, and business stakeholders to ensure robust data ingestion, storage, and access frameworks that meet security, compliance, and performance standards.
Key Responsibilities :
- Design and implement Snowflake data warehouse and lakehouse architectures.
- Develop and optimize Snowflake SQL, stored procedures, and data transformation logic.
- Manage Snowflake account configurations, resource monitors, virtual warehouses, and access controls (see the sketch after this list).
- Integrate Snowflake with Databricks, Azure Data Factory, and cloud storage (ADLS/S3/GCS).
- Implement best practices for data partitioning, clustering, and caching for performance optimization.
- Participate in data ingestion automation, metadata management, and pipeline monitoring.
- Collaborate with security and governance teams to enforce RBAC, encryption, and compliance policies.
- Contribute to CI/CD automation for Snowflake deployments and pipeline orchestration.
- Provide technical leadership, mentoring, and knowledge sharing across teams.
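For illustration only, a minimal Snowflake SQL sketch of the warehouse, resource-monitor, clustering, and RBAC tasks above; every object name (analytics_wh, analytics_rm, sales.fact_orders, analyst_role) is a hypothetical placeholder, not a reference to this role's actual environment, and the credit quota is illustrative.

-- Sized warehouse that suspends itself when idle
CREATE WAREHOUSE IF NOT EXISTS analytics_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 300   -- seconds of inactivity before suspend
  AUTO_RESUME = TRUE;

-- Resource monitor capping monthly credit spend (requires ACCOUNTADMIN)
CREATE OR REPLACE RESOURCE MONITOR analytics_rm
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = analytics_rm;

-- Clustering key on a large fact table to improve micro-partition pruning
ALTER TABLE sales.fact_orders CLUSTER BY (order_date, region);

-- RBAC: grant warehouse usage to a role, never directly to users
GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_role;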
Required Skills & Qualifications :
- 8+ years of experience in Data Engineering, with 3+ years in Snowflake.
- Deep expertise in Snowflake platform features.
- Strong SQL and ETL/ELT development skills (see the MERGE sketch after this list).
- Experience with data modeling (Kimball/Inmon/Dimensional).
- Hands-on experience with :
- Databricks or PySpark for transformation
- Azure Data Factory / Airflow / dbt for orchestration
- Cloud storage (Azure Data Lake, S3, or GCS)
- Knowledge of data governance, RBAC, encryption, and compliance frameworks (GDPR, HIPAA, etc.).
- Familiarity with CI/CD pipelines (Azure DevOps, GitHub Actions, Jenkins).
- Strong problem-solving and communication skills.
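As one concrete example of the ELT skills listed above, a hedged sketch of an incremental upsert using Snowflake's MERGE statement; the database, table, and column names are invented placeholders.

-- Upsert staged changes into a dimension table: update rows that
-- already exist, insert rows that do not
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_updates AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
  tgt.email      = src.email,
  tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, src.updated_at);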
Preferred / Good To Have :
- Experience with Snowpipe and auto-ingestion (see the sketch below).
- Exposure to Delta Lake and Unity Catalog.
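A minimal Snowpipe auto-ingest sketch, assuming an external stage over cloud storage; the bucket URL, storage integration name (s3_int), and table names are placeholders, and the cloud-side event notification setup is omitted.

-- External stage pointing at a landing area in cloud storage
CREATE STAGE IF NOT EXISTS raw.landing_stage
  URL = 's3://example-bucket/landing/'
  STORAGE_INTEGRATION = s3_int;   -- assumed pre-configured integration

-- Pipe that loads new files as storage event notifications arrive
CREATE PIPE IF NOT EXISTS raw.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.orders
  FROM @raw.landing_stage/orders/
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);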
Certifications :
- SnowPro Core / Advanced Architect / Data Engineer.
- Azure / AWS Data Engineer certifications.
Skills : Azure, Python, SQL, Snowflake
Functional Area : Data Engineering
Job Code : 1567115