Posted on: 25/08/2025
Job Role - Snowflake Developer
Experience - 5-7 Years
Location - Trivandrum/Kochi/Bangalore/Chennai/Pune/Noida/Hyderabad
Work Model - Hybrid
Mandatory Skills - Snowflake, PySpark
Must Have Skills:
Data Warehouse:
- Design, implement, and optimize data warehouses on the Snowflake platform.
- Ensure effective utilization of Snowflake features for scalable and high-performance data storage (see the illustrative sketch below).
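For context, here is a minimal sketch of the kind of warehouse and table setup this work involves, issued from Python via the snowflake-connector-python package. The credentials, warehouse name (ETL_WH), and table names are placeholders assumed for illustration, not details from this posting.

```python
# Minimal sketch: Snowflake DDL issued from Python via snowflake-connector-python.
# Account, credentials, and object names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    role="SYSADMIN",
)
cur = conn.cursor()
try:
    # Size the virtual warehouse for the workload and let it auto-suspend
    # so idle time does not accrue credits.
    cur.execute("""
        CREATE WAREHOUSE IF NOT EXISTS ETL_WH
          WAREHOUSE_SIZE = 'MEDIUM'
          AUTO_SUSPEND = 300
          AUTO_RESUME = TRUE
    """)
    # Example fact table with a clustering key on the common filter column
    # to keep micro-partition pruning effective as the table grows.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS ANALYTICS.PUBLIC.SALES_FACT (
            SALE_ID      NUMBER,
            SALE_DATE    DATE,
            CUSTOMER_ID  NUMBER,
            AMOUNT       NUMBER(12, 2)
        )
        CLUSTER BY (SALE_DATE)
    """)
finally:
    cur.close()
    conn.close()
```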
Data Pipeline Development:
- Develop, implement, and optimize end-to-end data pipelines on Snowflake.
- Create and maintain ETL workflows for seamless data processing (see the sketch below).
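As a rough illustration of an ETL workflow on Snowflake, the sketch below bulk-loads staged files into a landing table and schedules an incremental MERGE with a Snowflake task. The stage, table names, and schedule are assumptions for the sketch, not requirements from this posting.

```python
# Illustrative ETL step on Snowflake: land files from a stage, then schedule
# an incremental MERGE via a task. All object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",  # placeholders
    role="SYSADMIN", warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()
try:
    # Bulk-load staged CSV files into a raw landing table.
    cur.execute("""
        COPY INTO RAW_SALES
        FROM @SALES_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
    # A scheduled task that periodically upserts landed rows into the fact table.
    cur.execute("""
        CREATE TASK IF NOT EXISTS MERGE_SALES_TASK
          WAREHOUSE = ETL_WH
          SCHEDULE = '60 MINUTE'
        AS
          MERGE INTO SALES_FACT AS t
          USING RAW_SALES AS s
            ON t.SALE_ID = s.SALE_ID
          WHEN MATCHED THEN UPDATE SET t.AMOUNT = s.AMOUNT
          WHEN NOT MATCHED THEN INSERT (SALE_ID, SALE_DATE, CUSTOMER_ID, AMOUNT)
            VALUES (s.SALE_ID, s.SALE_DATE, s.CUSTOMER_ID, s.AMOUNT)
    """)
    # Tasks are created suspended; resume to start the schedule.
    cur.execute("ALTER TASK MERGE_SALES_TASK RESUME")
finally:
    cur.close()
    conn.close()
```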
Data Transformation with PySpark:
- Leverage PySpark for advanced data transformations within the Snowflake environment.
- Implement data cleansing, enrichment, and validation processes using PySpark (see the sketch below).
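The following PySpark sketch illustrates this kind of cleansing, validation, and enrichment work, reading from and writing back to Snowflake via the spark-snowflake connector (assumed to be on the Spark classpath). Connection values, table names, and column names are placeholders.

```python
# Minimal PySpark sketch: read raw data from Snowflake, cleanse and enrich it,
# and write the curated result back. All connection values are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales_cleansing").getOrCreate()

sf_options = {
    "sfURL": "your_account.snowflakecomputing.com",  # placeholder
    "sfUser": "your_user",                           # placeholder
    "sfPassword": "your_password",                   # placeholder
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

# Read the raw landing table into a DataFrame.
raw = (spark.read
       .format("net.snowflake.spark.snowflake")
       .options(**sf_options)
       .option("dbtable", "RAW_SALES")
       .load())

# Cleansing and validation: drop duplicates, reject rows that fail basic
# checks, and enrich with a derived column.
clean = (raw.dropDuplicates(["SALE_ID"])
            .filter(F.col("AMOUNT") > 0)
            .filter(F.col("SALE_DATE").isNotNull())
            .withColumn("SALE_YEAR", F.year("SALE_DATE")))

# Write the curated result back to Snowflake.
(clean.write
      .format("net.snowflake.spark.snowflake")
      .options(**sf_options)
      .option("dbtable", "SALES_CLEAN")
      .mode("overwrite")
      .save())
```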
Requirements:
- Proven experience as a Data Engineer, with a strong emphasis on Snowflake.
- Proficiency in Snowflake features and capabilities for data warehousing.
- Expertise in PySpark for data processing and analytics is a must.
- Experience with data modeling, ETL processes, and efficient data storage.
- Proficiency in programming languages such as Python, SQL, or Scala for data processing.
Skills:
- Snowflake, PySpark, Python, SQL
Posted in - Data Engineering
Functional Area - Big Data / Data Warehousing / ETL
Job Code - 1534935