
PibyThree - Snowflake Developer - PL-SQL

PibyThree
Multiple Locations
4 - 6 Years
Rating: 4.6 (32+ reviews)

Posted on: 21/11/2025

Job Description

Role Overview:

We are seeking a skilled Snowflake Data Engineer with strong experience in Snowflake Data Cloud, PL/SQL, and modern data engineering practices. The ideal candidate will design, build, and optimize data pipelines, ETL workflows, and Snowflake data environments. The role requires deep expertise in SQL and PL/SQL, cloud platforms, Snowflake features, and data ingestion frameworks.

Key Responsibilities (KRA):

- Designing, developing, and managing end-to-end data pipelines on Snowflake

- Implementing data ingestion processes using formats such as CSV, JSON, Parquet, and Avro

- Writing advanced SQL, PL/SQL, SnowSQL, and stored procedures for data transformation

- Performing Snowflake staging (internal/external) and implementing efficient loading strategies (see the ingestion sketch after this list)

- Optimizing Snowflake performance through query tuning and data model improvements

- Troubleshooting failures, performing root-cause analysis, and resolving technical issues across ETL workflows

- Developing ETL routines using Python, Scala, PySpark, or ETL tools

- Collaborating with cloud teams to manage AWS/Azure/GCP environments supporting Snowflake workflows

- Evaluating and improving existing staging, source, and reporting data structures

- Ensuring high data quality, reliability, and governance across all data assets

- Implementing dbt models and managing transformations within the data pipeline

- Creating documentation for data flows, transformation logic, and Snowflake architecture

- Working with cross-functional teams including data analysts, BI developers, and cloud engineers

- Ensuring best practices in data security, orchestration, and cloud integration
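
As a rough illustration of the ingestion and staging responsibilities above, the sketch below defines a CSV file format and an external stage, then bulk-loads files with COPY INTO. Every object name here (csv_fmt, ext_sales_stage, staging.sales_raw, the S3 bucket, and the s3_int storage integration) is a hypothetical placeholder, not part of this role's actual environment:

    -- Minimal ingestion sketch (SnowSQL); all names are hypothetical.
    CREATE OR REPLACE FILE FORMAT csv_fmt
      TYPE = CSV
      SKIP_HEADER = 1
      FIELD_OPTIONALLY_ENCLOSED_BY = '"';

    CREATE OR REPLACE STAGE ext_sales_stage
      URL = 's3://example-bucket/sales/'
      STORAGE_INTEGRATION = s3_int  -- assumes an existing storage integration
      FILE_FORMAT = (FORMAT_NAME = 'csv_fmt');

    -- Bulk-load all CSV files from the external stage into a staging table.
    COPY INTO staging.sales_raw
      FROM @ext_sales_stage
      PATTERN = '.*[.]csv'
      ON_ERROR = 'ABORT_STATEMENT';  -- fail fast; 'CONTINUE' skips bad rows

JSON, Parquet, and Avro sources load the same way with a matching FILE FORMAT definition; PATTERN selects files and ON_ERROR controls failure behavior.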

Required Skillsets:

- Strong hands-on experience with Snowflake Data Cloud

- Expertise in SQL, advanced PL/SQL, and Oracle database programming

- Experience with SnowSQL, stored procedures, and Snowflake-specific features (a stored-procedure sketch follows this list)

- Knowledge of internal and external Snowflake staging and data loading options

- Proficiency in data ingestion from multiple file formats (CSV, JSON, Parquet, Avro, etc.)

- Strong understanding of ETL development using Python, Scala, or PySpark

- Experience with AWS, Azure, or GCP cloud platforms

- Knowledge of dbt for data transformation and modeling (a dbt model sketch also follows this list)

- Experience in SQL performance tuning and root-cause analysis

- Familiarity with modern data architectures and data modeling concepts

- Strong problem-solving skills and the ability to troubleshoot complex pipelines

- Experience evaluating and improving existing data structures

- Good communication, documentation, and teamwork abilities
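
To indicate the level of SnowSQL and stored-procedure work implied above, here is a minimal Snowflake Scripting sketch that merges staged rows into a target table. The schemas, tables, and columns are illustrative assumptions only:

    -- Minimal Snowflake Scripting sketch; object names are hypothetical.
    CREATE OR REPLACE PROCEDURE merge_daily_orders()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    BEGIN
      -- Upsert staged rows into the target table.
      MERGE INTO analytics.orders AS tgt
      USING staging.orders_raw AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET
        status = src.status,
        updated_at = src.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
        VALUES (src.order_id, src.status, src.updated_at);
      -- SQLROWCOUNT holds the row count of the last DML statement.
      RETURN 'Merged ' || SQLROWCOUNT || ' rows';
    END;
    $$;

Invoked with CALL merge_daily_orders();, the procedure returns the affected row count via the built-in SQLROWCOUNT variable.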
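
Likewise, for the dbt knowledge listed above, an incremental dbt model is an ordinary SQL select wrapped in Jinja configuration; the model and source names below (fct_daily_sales, stg_sales) are made up for illustration:

    -- models/marts/fct_daily_sales.sql: a hypothetical incremental dbt model
    {{ config(materialized='incremental', unique_key='sale_date') }}

    select
        sale_date,
        sum(amount) as total_amount,
        count(*)    as order_count
    from {{ ref('stg_sales') }}
    {% if is_incremental() %}
    -- On incremental runs, only scan dates newer than what is already loaded.
    where sale_date > (select max(sale_date) from {{ this }})
    {% endif %}
    group by sale_date

On the first run dbt builds the full table; on later runs the is_incremental() block restricts the scan to new dates.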

Qualifications:

- Bachelor's degree in Computer Science, Engineering, Information Systems, or equivalent

- 4 - 7 years of experience in data engineering roles

- Minimum 2 years of hands-on experience with Snowflake

- Strong hands-on experience with PL/SQL and SQL development

- Experience working with cloud environments such as AWS, Azure, or GCP
