Posted on: 27/08/2025
Job Title : Snowflake Developer (Python, SQL, Snowpark)
Location : Bangalore
Experience Level : 7 - 8 years
Employment Type : Full-Time / Contract
Notice : 15 - 20 days
Key Responsibilities :
- Data Pipeline Development : Design, develop, and maintain robust, scalable, and high-performance data pipelines using a combination of Snowflake SQL, Python, and Snowpark.
- Snowpark Expertise : Utilize the Snowpark Python API to build advanced data engineering and data science workflows directly within the Snowflake environment, leveraging its native processing capabilities (a minimal illustrative sketch follows this list).
- Data Transformation and Optimization : Perform complex data transformation, modeling, and optimization to support business intelligence and reporting needs. Tune queries and manage warehouse usage to improve performance and reduce costs.
- Cloud Integration : Leverage Azure data services for data ingestion, orchestration, and observability, ensuring seamless integration with the Snowflake platform.
- Data Governance and Quality : Implement best practices for data governance, security, and data quality to maintain the integrity and reliability of data within Snowflake.
- Collaboration : Work with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
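The following is a minimal, illustrative Snowpark (Python API) sketch of the kind of pipeline work described above. It is not part of the role's codebase; the connection parameters, table names, and column names are placeholders, not details from this posting.

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Placeholder connection details -- substitute your own account, credentials,
# and context before running.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "role": "<role>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# Read a hypothetical raw orders table, keep completed orders, and aggregate
# revenue by day -- the transformations are pushed down to Snowflake's engine.
orders = session.table("RAW_ORDERS")
daily_revenue = (
    orders
    .filter(col("ORDER_STATUS") == "COMPLETED")
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("ORDER_AMOUNT")).alias("DAILY_REVENUE"))
)

# Persist the curated result as a table for BI and reporting consumers.
daily_revenue.write.mode("overwrite").save_as_table("CURATED_DAILY_REVENUE")

session.close()

Because the transformations execute inside Snowflake, the same pattern scales to larger pipelines while keeping warehouse usage visible for performance and cost tuning.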
Required Skills & Experience :
- Snowflake Proficiency : Minimum of 7 years of hands-on experience in Snowflake development and administration.
- SQL & Python : Strong command of SQL for complex data manipulation, and solid proficiency in Python, especially in the context of data engineering.
- Snowpark : Proven hands-on experience with Snowpark for developing data pipelines or analytical applications.
- Data Architecture : A deep understanding of data warehouse architecture, ELT/ETL processes, and cloud-based data platforms.
- Tools & Processes : Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
Preferred Qualifications :
- Cloud Platforms : Experience with major cloud platforms such as AWS, Azure, or GCP.
- Orchestration : Knowledge of orchestration tools such as Apache Airflow.
- Security : Familiarity with data security and role-based access control (RBAC) within the Snowflake environment (a brief illustrative sketch follows this list).
- Certifications : Snowflake certifications are a plus.
- Tools : Knowledge of dbt (data build tool) is highly desirable.
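As a brief illustration of the RBAC point above, the sketch below creates a read-only reporting role by running Snowflake SQL through a Snowpark session. All account, role, database, and schema names are placeholders, not requirements from this posting.

from snowflake.snowpark import Session

# Placeholder connection details, as in the earlier pipeline sketch; the role
# used here must be permitted to create roles and grant privileges.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "role": "SECURITYADMIN",
    "warehouse": "<warehouse>",
}).create()

# Create a hypothetical read-only analyst role and grant it access to a
# curated schema.
for stmt in [
    "CREATE ROLE IF NOT EXISTS ANALYST_ROLE",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_ROLE",
    "GRANT USAGE ON SCHEMA ANALYTICS.CURATED TO ROLE ANALYST_ROLE",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.CURATED TO ROLE ANALYST_ROLE",
    "GRANT ROLE ANALYST_ROLE TO ROLE SYSADMIN",
]:
    session.sql(stmt).collect()

session.close()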
Soft Skills :
- Problem-Solving : Strong analytical and problem-solving skills with the ability to tackle complex data challenges.
- Collaboration : Ability to work both independently and as part of a collaborative team.
- Communication : Excellent communication and documentation skills to articulate technical concepts and processes effectively.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1535733