Posted on: 18/07/2025
Job Description :
curatAId is seeking a Senior Snowflake Consultant on behalf of our client, a fast-growing organization focused on data-driven innovation. This role combines Snowflake expertise with DevOps, DBT, and Airflow to support the development and operation of a modern, cloud-based enterprise data platform. The ideal candidate will be responsible for building and managing data infrastructure, developing scalable data pipelines, implementing data quality and governance frameworks, and automating workflows for operational efficiency. To apply for this position, it is mandatory to register on our platform at www.curataid.com and complete a 10-minute technical quiz on Snowflake.
Title : Senior Data Engineer
Level : Consultant/Deputy Manager/Manager/Senior Manager
Relevant Experience : 5+ years of hands-on experience with Snowflake, along with DevOps, DBT, and Airflow
Must Have Skill : Data Engineering, Snowflake, DBT, Airflow & DevOps
Location : Mumbai, Gurgaon, Bengaluru, Chennai, Kolkata, Bhubaneshwar, Coimbatore, Ahmedabad
Qualifications :
- 5+ years of relevant Snowflake experience in a data engineering context. (Must Have)
- 4+ years of relevant experience in DBT, Airflow & DevOps. (Must Have)
- Strong hands-on experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
- Must have experience with cloud data warehouses such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse.
- Experience with version control systems (GitHub, Bitbucket, GitLab).
- Strong SQL expertise.
- Experience implementing best practices for data storage management, security, and retrieval efficiency.
- Experience with pipeline orchestration tools (Fivetran, Stitch, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Java, Scala, etc.); a brief illustrative sketch follows this list.
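For illustration only, here is a minimal sketch of the kind of Snowflake-plus-Python work the role calls for, using the snowflake-connector-python package. The account, credentials, and table name are placeholders, not details from this posting:

    # Minimal sketch: querying Snowflake from Python.
    # Account, credentials, and table name are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # placeholder Snowflake account identifier
        user="my_user",            # placeholder credentials
        password="my_password",
        warehouse="COMPUTE_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Example of the kind of aggregate query the role involves
        cur.execute(
            "SELECT order_date, SUM(amount) AS daily_total "
            "FROM orders GROUP BY order_date ORDER BY order_date"
        )
        for order_date, daily_total in cur.fetchall():
            print(order_date, daily_total)
    finally:
        conn.close()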
What You'll Do :
- Data Pipeline Development : Design, develop, and maintain high-volume ETL/ELT pipelines, ensuring data quality, efficiency, and reliability.
- Snowflake Expertise : Leverage your deep expertise in Snowflake to build and optimize data warehouses, implementing best practices for data storage, security, and retrieval efficiency.
- Orchestration & Transformation : Utilize DBT (Data Build Tool) for data transformation and modeling, and Airflow for orchestrating complex data workflows (a minimal DAG sketch follows this list).
- DevOps Integration : Implement and maintain DevOps practices for data engineering, ensuring seamless integration, continuous delivery, and automated deployments.
- Cloud Data Warehousing : Work with various cloud data warehouses, including Snowflake, and potentially Amazon Redshift, Google BigQuery, or Azure Synapse.
- Version Control : Collaborate effectively using version control systems such as GitHub, Bitbucket, or GitLab.
- SQL Mastery : Write complex and efficient SQL queries for data manipulation, analysis, and reporting.
- Coding Proficiency : Develop and integrate solutions using at least one modern programming language (e.g., Python, Java, Scala).
- Best Practices : Advocate for and implement best practices in data modeling, data warehousing, and pipeline development.
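As a sketch of the orchestration pattern described above, here is a minimal Airflow DAG that runs a dbt build step followed by a dbt test step from the command line. The DAG id, schedule, and project path are illustrative assumptions, not specifics of this role:

    # Minimal sketch: an Airflow DAG orchestrating dbt run/test steps.
    # The dag_id, schedule, and project path are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_pipeline",      # placeholder DAG name
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Build/refresh dbt models in the warehouse
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",  # placeholder path
        )
        # Enforce the data-quality checks defined in the dbt project
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )
        dbt_run >> dbt_test

Separating the run and test tasks lets a failed data-quality check halt the pipeline before downstream consumers see bad data.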
Functional Area : Data Engineering
Job Code : 1514566