Posted on: 20/11/2025
About the Role:
We are seeking a highly skilled Senior Data Engineer with strong expertise in Snowflake, SQL, and Python to design, build, and optimize our data ecosystem. You will play a critical role in creating scalable data pipelines, ensuring high data quality, and supporting analytics and AI/ML initiatives across the organization. This position is ideal for someone who enjoys solving complex data problems and building enterprise-grade data platforms.
Key Responsibilities:
- Design, build, and maintain scalable ETL/ELT pipelines using Snowflake, Python, and related tools.
- Automate data ingestion from multiple structured and unstructured sources (APIs, databases, cloud storage, etc.).
- Implement pipeline monitoring, error handling, retry logic, and alerts.
- Develop and maintain Snowflake objects such as schemas, tables, views, streams, tasks, stages, and warehouses.
- Optimize Snowflake performance through warehouse sizing, clustering, micro-partitioning, and query tuning.
- Implement data sharing, secure views, role-based access models, and data governance practices.
- Write complex, high-performance SQL queries for data transformation and analytics.
- Design and implement data models (3NF, Star/Snowflake schema) that support BI and analytics use cases.
- Implement data quality frameworks, validation checks, and reconciliation processes.
- Use Python for data transformation, ingestion frameworks, API integration, and ETL automation.
- Build reusable components, libraries, and scripts for workflow orchestration.
- Set up continuous monitoring of pipelines, Snowflake performance, and data quality metrics.
- Ensure data privacy, compliance, and security using best practices (RBAC, masking policies, encryption).
- Troubleshoot complex data and performance problems.
- Partner with data analysts, BI developers, and data scientists to understand requirements and deliver reliable datasets.
- Collaborate with DevOps/Cloud teams on infrastructure provisioning, CI/CD, and deployment.
- Provide technical mentorship to junior engineers.
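To give candidates a concrete feel for the pipeline duties above (automated ingestion, retry logic, and data-quality checks), here is a minimal Python sketch. The function and field names (`fetch_with_retry`, `validate_rows`, `extract`, `id`, `amount`) are hypothetical illustrations, not part of our codebase.

```python
import time

def fetch_with_retry(fetch, max_attempts=3, backoff_seconds=1.0):
    """Call a fetch function, retrying on failure with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the error for alerting
            time.sleep(backoff_seconds * 2 ** (attempt - 1))

def validate_rows(rows, required_keys):
    """Basic data-quality check: every record carries the required keys."""
    bad = [r for r in rows if not required_keys.issubset(r)]
    if bad:
        raise ValueError(f"{len(bad)} record(s) failed validation")
    return rows

# Hypothetical source standing in for an API or database extract.
def extract():
    return [{"id": 1, "amount": 10.5}, {"id": 2, "amount": 7.25}]

rows = validate_rows(fetch_with_retry(extract), {"id", "amount"})
print(len(rows))  # 2 records passed validation
```

In production this pattern would typically load the validated rows into a Snowflake stage and surface retry exhaustion through monitoring and alerts, per the responsibilities listed above.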
Required Skills & Experience:
- 5+ years of hands-on experience in Data Engineering, including at least 3 years with Snowflake.
- Strong expertise in Snowflake SQL, Snowflake components, and performance tuning.
- Proficiency in Python for building scalable ETL/ELT workflows.
- Strong understanding of ETL processes, data integration patterns, and data warehousing concepts.
- Experience with data modeling (dimensional & relational).
- Knowledge of cloud platforms (AWS / Azure / GCP); familiarity with S3, Lambda, Databricks, or similar is a plus.
- Experience with workflow orchestration tools (Airflow, Prefect, etc.) and transformation frameworks such as dbt is an advantage.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1577784