Posted on: 11/08/2025
Role: Data Engineer
Location: Remote, India
Type: Contract/C2H
Job Description:
Key Responsibilities:
- Design, develop, and maintain data pipelines using Azure Data Factory.
- Implement data models and structures in Snowflake to support analytics and reporting.
- Write complex SQL queries for data extraction, transformation, and loading (ETL).
- Develop and maintain Python scripts for data processing and automation.
- Collaborate with data scientists and analysts to understand data requirements and deliver solutions.
- Monitor and optimize data workflows for performance and reliability.
- Implement Snowpipe, Snowpark, and Snowflake Streams and Tasks for continuous, near-real-time data processing.
- Use Databricks and dbt for data transformation and orchestration.
Primary Skills:
- Azure Data Factory: 3-4 years of experience designing and implementing data integration solutions.
- Snowflake: 3-4 years of experience in data warehousing and analytics.
- Data Modeling: 3-4 years of experience designing data models for analytical purposes.
- SQL: 3-4 years of experience writing and optimizing SQL queries.
- Python: 3-4 years of experience in data processing and automation.
Secondary Skills:
- Experience with Snowpipe for continuous data ingestion.
- Familiarity with Snowpark for data processing using Python.
- Knowledge of Snowflake Streams and Tasks for managing data changes.
- Experience with Databricks for big data processing.
- Proficiency in dbt for data transformation and modeling.
Functional Area: Data Engineering
Job Code: 1528199