Posted on: 10/03/2026
Company Overview:
Nazztec Private Limited is a leading technology solutions provider specializing in data engineering and analytics. We empower businesses across various sectors, including finance, healthcare, and retail, to unlock the full potential of their data through innovative and scalable solutions. Our expertise lies in building robust data pipelines, developing insightful dashboards, and implementing advanced analytics techniques to drive informed decision-making.
Role Overview:
As a Snowflake Developer at Nazztec, you will be instrumental in designing, developing, and maintaining our clients' data warehousing solutions on the Snowflake platform. You will collaborate closely with data engineers, data scientists, and business stakeholders to understand their data requirements and translate them into efficient and scalable data models and ETL pipelines.
Your work will directly impact our clients' ability to gain valuable insights from their data, improve operational efficiency, and make data-driven decisions.
Key Responsibilities:
- Design and develop scalable and efficient data models on the Snowflake platform to meet business requirements.
- Build and maintain robust ETL pipelines using Apache Airflow and other relevant tools to ingest, transform, and load data into Snowflake (a brief, illustrative sketch of such a pipeline follows this list).
- Write complex SQL queries and stored procedures to extract, manipulate, and analyze data within Snowflake.
- Implement data quality checks and monitoring processes to ensure data accuracy and reliability.
- Collaborate with data engineers and data scientists to optimize data pipelines and improve query performance.
- Develop and maintain CI/CD pipelines for automated deployment of Snowflake objects and data pipelines.
- Troubleshoot and resolve data-related issues in a timely manner to minimize business impact.
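To give candidates a concrete picture of the day-to-day work, here is a minimal sketch of the kind of Airflow-to-Snowflake pipeline this role owns. Every identifier below (DAG ID, connection ID, stage and table names) is a hypothetical placeholder, not Nazztec or client code.

```python
# Minimal, illustrative Airflow DAG: load staged files into Snowflake, then
# run a sanity check. Requires the apache-airflow-providers-snowflake package;
# on Airflow < 2.4, use schedule_interval instead of schedule.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_sales_load",          # placeholder DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # COPY INTO skips files it has already loaded, keeping reruns idempotent.
    load_sales = SnowflakeOperator(
        task_id="load_sales",
        snowflake_conn_id="snowflake_default",  # assumed Airflow connection
        sql="""
            COPY INTO analytics.raw.sales
            FROM @analytics.raw.sales_stage
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
        """,
    )

    # Division by zero raises an error in Snowflake, so this task fails fast
    # if the load produced an empty table.
    check_rows = SnowflakeOperator(
        task_id="check_rows",
        snowflake_conn_id="snowflake_default",
        sql="SELECT 1 / COUNT(*) FROM analytics.raw.sales;",
    )

    load_sales >> check_rows
```

In practice, checks like this would live in a shared data-quality framework rather than inline SQL, but the shape of the DAG is representative of the pipelines you would build here.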
Required Skillset:
- Demonstrated expertise in designing and developing data warehousing solutions on Snowflake.
- Proven ability to build and maintain ETL pipelines using Apache Airflow or similar orchestration tools.
- Strong proficiency in writing complex SQL queries and stored procedures.
- Experience with Python for data manipulation and automation (illustrated in the sketch after this list).
- Familiarity with cloud platforms such as AWS or Azure.
- Understanding of CI/CD principles and experience with implementing automated deployment pipelines.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.
- Bachelor's degree in Computer Science or a related field.
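As a flavor of how Python and SQL come together in this role, below is a minimal sketch of a data-quality check run against Snowflake with the snowflake-connector-python package. The account, credentials, and table name are placeholders; real credentials would come from a secrets store.

```python
# Illustrative inline data-quality check against Snowflake; all identifiers
# are hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",                          # placeholder account
    user="etl_user",                            # placeholder user
    password=os.environ["SNOWFLAKE_PASSWORD"],  # never hard-code secrets
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Flag NULL and duplicate business keys before downstream models run.
    cur.execute("""
        SELECT
            COUNT_IF(order_id IS NULL)          AS null_keys,
            COUNT(*) - COUNT(DISTINCT order_id) AS duplicate_keys
        FROM sales
    """)
    null_keys, duplicate_keys = cur.fetchone()
    if null_keys or duplicate_keys:
        raise ValueError(
            f"Quality check failed: {null_keys} NULL keys, "
            f"{duplicate_keys} duplicate keys"
        )
finally:
    conn.close()
```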
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1619164