hirist

Snowflake Engineer - Data Build Tool

Lakshya Software Technologies Private Limited
Multiple Locations
7 - 12 Years
4.2 · 26+ Reviews

Posted on: 04/12/2025

Job Description

Description :


Role : Snowflake DBT Engineer

Total Years of Experience : 7+ years

Location : Pune/Mumbai/Chennai/Bangalore

Mandatory Skills :


- Snowflake : 6 years


- DBT : 6 years


- Python : 3 years


- Airflow : 2 years


- Azure cloud experience is good to have


Roles & Responsibilities :


- 5+ years of working experience in Snowflake, Databricks, DBT and Python on Azure/GCP/AWS in production environments and hands-on experience in developing moderate to complex ETL/ELT data pipelines


- Strong in Apache Spark and Delta Lake; strong SQL and Python skills; experience with data engineering and ETL pipelines


- Advanced SQL in Snowflake (CTEs, Views, UDFs, Materialized Views) & DBT


- Partitioning, clustering, and performance tuning in Snowflake


- Streams, Tasks, and Snowpipe for real-time and incremental data pipelines


- Cost and query optimization (e.g., using result cache, pruning, compute credits control)


- DBT : Data Modeling, SQL Mastery, Jinja Templating, Workflow Optimization & Configuration, Data Loading and Cleaning


- Hands-on Python experience


- 3+ years of hands-on experience with Cloud Composer (Airflow) developing DAGs


- 3+ years of hands-on experience with Databricks and the ability to resolve complex SQL query performance issues


- 4+ years of ETL Python development experience; experience parallelizing pipelines a plus


- Demonstrated ability to troubleshoot complex query, pipeline, and data quality issues


- Develop, test, and deploy robust ETL pipelines on Snowflake on MS Azure/AWS/GCP Platform using, but not limited to, Snowflake/dbt/Databricks, Cloud Composer, and Cloud Run to ingest structured/unstructured data from on-prem SQL Server instances into Snowflake Models/Database/Schema



- Implement data validation checks, error handling, and logging to ensure pipeline reliability


- Automate deployment workflows using CI/CD pipelines (GitHub Actions, etc.) and infrastructure-as-code (IaC) tools


- Monitor pipelines via Cloud Logging and Cloud Monitoring, implementing alerting for data latency or quality issues


- Be able to explain the solution approach to problems
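
As an illustration of the "Advanced SQL (CTEs, Views, UDFs, Materialized Views)" requirement above, the following is a minimal, hypothetical sketch of the CTE pattern, run here against Python's built-in sqlite3 as a stand-in for Snowflake. The table and column names (`orders`, `customer_id`, `amount`) are invented for the example; the CTE itself would run unchanged in Snowflake.

```python
import sqlite3

# In-memory SQLite database standing in for a Snowflake schema.
# Table and column names are invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 20, 50.0);
""")

# A CTE aggregates per customer, then the outer query filters on the
# aggregate -- a common pattern the role's "advanced SQL" bullet refers to.
rows = conn.execute("""
    WITH customer_totals AS (
        SELECT customer_id, SUM(amount) AS total_spend
        FROM orders
        GROUP BY customer_id
    )
    SELECT customer_id, total_spend
    FROM customer_totals
    WHERE total_spend > 100
""").fetchall()
print(rows)  # [(10, 200.0)]
```

In dbt, the same CTE would typically live inside a model's `.sql` file, with the table reference swapped for a `ref()` or `source()` call.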
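The "data validation checks, error handling, and logging" responsibility can likewise be sketched in a few lines. This is a hypothetical example, not the team's actual pipeline: the field names (`order_id`, `amount`) and thresholds are invented, and the routine routes bad rows aside with stdlib logging instead of failing the whole load.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def validate_batch(records):
    """Split a batch into good and rejected rows, logging each rejection.

    Field names and rules here are illustrative placeholders, not a real
    schema: order_id must be present and amount must be non-negative.
    """
    good, bad = [], []
    for rec in records:
        try:
            if rec.get("order_id") is None:
                raise ValueError("order_id is required")
            if rec["amount"] < 0:
                raise ValueError("amount must be non-negative")
            good.append(rec)
        except (KeyError, ValueError) as exc:
            log.warning("rejected record %r: %s", rec, exc)
            bad.append(rec)
    log.info("validated batch: %d good, %d rejected", len(good), len(bad))
    return good, bad

good, bad = validate_batch([
    {"order_id": 1, "amount": 120.0},
    {"order_id": None, "amount": 10.0},
    {"order_id": 3, "amount": -5.0},
])
```

Routing rejects aside rather than raising keeps one bad row from blocking an entire load; in an orchestrated pipeline the rejected rows would typically land in a quarantine table for later inspection.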

