Posted on: 06/10/2025
About the Role:
LanceSoft is seeking an experienced Senior Snowflake / DBT Data Engineer to design, build, and optimize modern data pipelines and warehouse solutions in a cloud-native environment.
The ideal candidate will have strong hands-on expertise with Snowflake, DBT, and cloud orchestration tools, ensuring scalable, reliable, and high-performance data solutions that empower analytics and business decision-making.
Key Responsibilities:
- Design, develop, and maintain efficient ETL pipelines using Snowflake, DBT, Databricks, Cloud Composer (Airflow), and Cloud Run (a sketch of this orchestration pattern follows this list).
- Ingest and transform structured and unstructured data from on-premises SQL Server into Snowflake databases, schemas, and models.
- Implement validation checks, logging, and error-handling frameworks to ensure data quality and reliability.
- Continuously optimize query and pipeline performance through partitioning, clustering, and cost-efficient design techniques.
- Develop automated CI/CD pipelines using tools such as Jenkins, GitHub Actions, and Infrastructure-as-Code (IaC) frameworks.
- Establish data observability and proactive monitoring with Cloud Logging and Cloud Monitoring, ensuring timely alerting and response.
- Collaborate with data analysts, architects, and business teams to define requirements and implement scalable data models.
- Document workflows, maintain best practices, and contribute to the continuous improvement of the data engineering ecosystem.
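To make the orchestration work above concrete, here is a minimal sketch of a Cloud Composer (Airflow) DAG that runs and then tests a dbt project against Snowflake. All names (the DAG id, the project path, the prod target) are illustrative assumptions, not details from this posting.

# A minimal sketch, assuming a hypothetical dbt project at /opt/dbt/warehouse
# with a Snowflake connection configured in profiles.yml; names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="snowflake_dbt_daily",  # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # Airflow 2.4+ style scheduling argument
    catchup=False,
) as dag:
    # Build the dbt models in Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/warehouse && dbt run --target prod",
    )
    # Validate the freshly built models with dbt's test suite.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/warehouse && dbt test --target prod",
    )
    dbt_run >> dbt_test

Running tests as a downstream task, rather than inside the build step, keeps a failed validation visible as its own task in the Airflow UI, which fits the validation and alerting responsibilities listed above.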
Basic Qualifications:
- 6+ years of overall Data Engineering experience.
- 4+ years of hands-on experience with Snowflake and DBT in production environments.
- 3+ years of experience working with Cloud Composer (Airflow) and BigQuery, including complex SQL and performance tuning.
- 4+ years of ETL development in Python, with experience building parallel pipelines and modularized codebases (see the parallel-pipeline sketch after this list).
- Strong knowledge of data modeling, performance tuning, and pipeline troubleshooting.
- Expertise in diagnosing and resolving query, data quality, and orchestration issues.
- Excellent communication skills and ability to collaborate effectively with cross-functional teams.
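The Python ETL qualification above maps to a common parallel-pipeline pattern, sketched below with stubbed extract/load helpers. A real pipeline would use a SQL Server driver (e.g. pyodbc) and the Snowflake Python connector; the table names and helper names here are hypothetical.

# A minimal sketch of the parallel-pipeline pattern; helpers are stubbed.
import logging
from concurrent.futures import ThreadPoolExecutor, as_completed

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

TABLES = ["orders", "customers", "inventory"]  # hypothetical source tables

def copy_table(table: str) -> str:
    """Extract one SQL Server table and load it into Snowflake (stubbed)."""
    log.info("extracting %s from SQL Server", table)
    # rows = extract_from_sqlserver(table)   # assumption: pyodbc-based helper
    # load_into_snowflake(table, rows)       # assumption: connector-based helper
    return table

def main() -> None:
    # Copy tables in parallel; failures are logged per table so one bad
    # table does not abort the whole run (a basic error-handling framework).
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(copy_table, t): t for t in TABLES}
        for fut in as_completed(futures):
            table = futures[fut]
            try:
                fut.result()
                log.info("loaded %s", table)
            except Exception:
                log.exception("failed to load %s", table)

if __name__ == "__main__":
    main()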
Preferred Skills:
- Experience with Databricks, Azure Data Factory, or GCP Dataflow.
- Familiarity with CI/CD automation and IaC tools such as Terraform or Cloud Deployment Manager.
- Understanding of data governance, lineage, and metadata management best practices.
- Exposure to real-time data streaming tools (Kafka, Pub/Sub, etc.) is a plus.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1555950