Posted on: 14/07/2025
Job Title: Snowflake Data Engineer
Location: PAN India / Remote / Hybrid (customizable)
Experience Required: 5+ Years (with strong Snowflake expertise)
Employment Type: Full-Time / Contract
Start Date: Immediate
Budget: Open
Job Summary:
We are looking for a highly skilled Snowflake Data Engineer with at least 5 years of experience building and optimizing data pipelines, transforming data, and enabling advanced analytics solutions. The ideal candidate has hands-on experience with Snowflake, SQL, and ETL frameworks, and has worked in cloud-based environments.
Key Responsibilities:
- Migrate data from legacy systems or other cloud platforms to Snowflake.
- Build scalable ETL/ELT workflows using tools such as dbt, Apache Airflow, Matillion, or Informatica (a minimal illustrative sketch follows this list).
- Develop complex SQL queries, stored procedures, and user-defined functions for data transformation.
- Optimize data models and queries for performance and cost.
- Implement data governance, security policies, data masking, and access controls in Snowflake.
- Integrate Snowflake with BI tools (Tableau, Power BI, Looker) and data sources.
- Collaborate with Data Architects, Analysts, and other Engineering teams to deliver quality data solutions.
- Monitor and troubleshoot data pipelines, resolve issues in real time, and ensure data reliability.
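For illustration only, an incremental ELT step of the kind described above might look like the following Python sketch using the snowflake-connector-python package. All object names (RAW.RAW_ORDERS_STREAM, ANALYTICS.ORDERS, TRANSFORM_WH) and credentials are hypothetical and are not part of this posting.

```python
# Hypothetical sketch: consume change rows from a Snowflake stream and merge
# them into a reporting table. Every object name below is illustrative.
import os
import snowflake.connector

MERGE_SQL = """
MERGE INTO ANALYTICS.ORDERS AS tgt
USING RAW.RAW_ORDERS_STREAM AS src          -- stream on the raw landing table
  ON tgt.ORDER_ID = src.ORDER_ID
WHEN MATCHED THEN UPDATE SET tgt.STATUS = src.STATUS, tgt.UPDATED_AT = src.UPDATED_AT
WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS, UPDATED_AT)
  VALUES (src.ORDER_ID, src.STATUS, src.UPDATED_AT)
"""

def run_incremental_load() -> int:
    """Run one incremental ELT step and return the number of affected rows."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",   # hypothetical virtual warehouse
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)      # consuming the stream advances its offset
        return cur.rowcount or 0
    finally:
        conn.close()

if __name__ == "__main__":
    print(f"Rows merged: {run_incremental_load()}")
```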
Required Skills:
- Strong SQL skills, with the ability to write complex queries and optimize their performance.
- Experience with Snowflake features: Virtual Warehouses, Streams, Tasks, Cloning, Time Travel, and Data Sharing.
- Knowledge of ETL/ELT processes and tools such as dbt, Apache Airflow, Matillion, or Informatica.
- Experience working in cloud environments: AWS, Azure, or GCP (preferably AWS).
- Familiarity with data warehousing concepts, dimensional modeling, and normalization.
- Experience with scripting (Python, Bash) for automation and orchestration (see the orchestration sketch after this list).
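The orchestration and scripting experience listed above could, for example, take the form of a simple Airflow DAG like the sketch below; the DAG id, schedule, and task are hypothetical and only indicate the kind of work involved.

```python
# Hypothetical orchestration sketch: a daily Airflow DAG that triggers one
# incremental Snowflake load. DAG id, schedule, and callable are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_orders() -> None:
    # In a real project this would call the team's ELT code, e.g. the
    # run_incremental_load() sketch shown earlier in this posting.
    pass

with DAG(
    dag_id="snowflake_incremental_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="merge_orders", python_callable=load_orders)
```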
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Snowflake certification (e.g., SnowPro Core Certification).
- Familiarity with CI/CD pipelines and DevOps practices in data projects.
- Experience integrating Snowflake with Kafka, S3, Kinesis, or other data ingestion tools (illustrated below).
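As a rough illustration of the ingestion work mentioned in the last item, a bulk load from an S3 external stage could look like this sketch; the stage, landing table (assumed to have a single VARIANT column), and warehouse names are hypothetical.

```python
# Hypothetical ingestion sketch: bulk-load staged JSON files from S3 into a
# raw landing table with COPY INTO. Stage, table, and warehouse are illustrative.
import os
import snowflake.connector

COPY_SQL = """
COPY INTO RAW.ORDERS_LANDING              -- landing table with a single VARIANT column
FROM @RAW.S3_ORDERS_STAGE                 -- external stage pointing at an S3 bucket
FILE_FORMAT = (TYPE = 'JSON')
ON_ERROR = 'SKIP_FILE'
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="RAW",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(COPY_SQL)
    for row in cur.fetchall():   # COPY INTO returns one status row per file
        print(row)
finally:
    conn.close()
```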
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1512911