Snowflake Engineer
Experience: 5-10 years
Location: Anywhere in India
Key Responsibilities:
- Lead ETL/ELT development using SQL, Python, and tools such as DBT, Airflow, Matillion, or Talend.
- Migrate complex T-SQL logic and stored procedures from SQL Server to Snowflake-compatible SQL or ELT workflows.
- Integrate AWS Glue to automate and orchestrate data workflows.
- Work with structured and semi-structured data formats (e.g., JSON, Parquet, Avro, XML).
- Optimize Snowflake performance and cost through effective query tuning and warehouse resource management.
- Design data models that support business intelligence and analytics use cases.
- Ensure high standards of data quality, validation, and consistency during migration and transformation processes.
- Enforce data governance, security, and access control policies to ensure compliance with organizational standards.
- Collaborate with data architects, business stakeholders, and analytics teams to understand requirements and deliver data solutions.
- Maintain up-to-date technical documentation including data flows, mapping specifications, and operational runbooks.
Required Skills & Qualifications:
- Hands-on experience with Snowflake and cloud-based data platforms (AWS preferred).
- Strong expertise in SQL and at least one scripting language (preferably Python).
- Experience with ETL/ELT tools like DBT, Airflow, Matillion, or Talend.
- Familiarity with AWS Glue and other cloud-native data services.
- Proven ability to work with semi-structured data.
- Solid understanding of data modeling, data warehousing concepts, and BI tools.
- Strong focus on performance tuning, data validation, and data quality.
- Excellent communication and documentation skills.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1516907