Posted on: 21/07/2025
Job Description:
We are looking for an experienced Snowflake Engineer to join our data engineering team.
The ideal candidate will have in-depth expertise in data warehousing, ETL/ELT pipelines, and Snowflake architecture.
You will be responsible for building and optimizing scalable data solutions that support advanced analytics, BI, and enterprise data strategies.
Key Responsibilities:
- Develop and optimize views, stored procedures, and user-defined functions in Snowflake SQL.
- Build dimensional models and ensure data is structured for performance and scalability.
- Build ETL/ELT pipelines using tools like Informatica, Matillion, dbt, Apache Airflow, or native Snowflake tasks.
- Integrate diverse data sources (structured/unstructured) into Snowflake from cloud/on-prem systems.
- Optimize pipeline performance and manage data quality and lineage.
- Work with cloud platforms such as AWS, GCP, or Azure for Snowflake integration and storage (e.g., S3, GCS, Blob Storage).
- Utilize Snowpipe, Streams, Tasks, and Stages for continuous data loading and transformation.
- Implement and manage data security, access control, RBAC, and data masking in Snowflake.
- Ensure compliance with organizational and regulatory data policies (GDPR, HIPAA, etc.).
- Tune query performance and warehouse sizing using Snowflake's performance optimization tools.
- Monitor cost usage and optimize resource consumption within Snowflake.
- Collaborate with data scientists, analysts, and BI teams to meet data requirements.
- Document technical designs, data flows, and processes for knowledge sharing and compliance.
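As an illustration of the Snowpipe, Streams, and Tasks pattern named in the responsibilities above, the continuous-loading flow can be sketched in Snowflake SQL. All object names (stage, pipe, tables, warehouse) and the storage URL below are hypothetical placeholders, not part of this posting:

```sql
-- Hypothetical continuous-loading pattern: Snowpipe ingests raw files,
-- a Stream captures new rows, and a Task merges them downstream.

-- Stage pointing at cloud storage (URL is a placeholder).
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/orders/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Snowpipe: auto-ingest new files from the stage into a landing table.
CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders FROM @raw_stage;

-- Stream: tracks changes on the landing table since last consumption.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- Task: runs on a schedule, but only when the stream has new data.
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  INSERT INTO orders_clean
  SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date
  FROM raw_orders_stream;

ALTER TASK load_orders_task RESUME;
```

Consuming the stream inside the task's DML advances the stream offset, so each batch of new rows is processed exactly once; this is the standard pairing of Streams with Tasks for incremental pipelines.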
Must-Have Skills:
- 2+ years of hands-on Snowflake development and administration
- Strong command of SQL, performance tuning, and Snowflake-specific features
- Hands-on experience with ETL tools (Informatica, Talend, dbt, Matillion, etc.)
- Familiarity with cloud services (AWS S3/Redshift, GCP BigQuery, Azure Synapse)
- Experience with data modeling, data partitioning, and columnar storage concepts
- Strong understanding of data governance, data quality, and security best practices
- Snowflake certification (e.g., SnowPro Core or Advanced Architect)
- Python or scripting experience for orchestration
- Experience with CI/CD pipelines for data deployments
- Exposure to streaming data (Kafka, Kinesis, etc.)
- Familiarity with BI tools like Tableau, Power BI, or Looker
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1516910