hirist

Technical Lead - Data Platform

Zywave
Pune
8 - 12 Years

Posted on: 14/11/2025

Job Description


Position : Technical Lead - Data Engineering (Snowflake, SQL Server, Data Architecture).

Location : Kharadi, Pune (Work from Office 5 Days a Week).

Shift Timing : 12 PM to 9 PM IST.

Experience : 8 to 12 Years.

Immediate Joiners Preferred.

About the Role :

We are looking for a Technical Lead with deep expertise in Snowflake, SQL Server, and modern data architecture principles (Medallion Architecture, Data Mesh) to lead the design and implementation of scalable, secure, and high-performance data solutions.

The ideal candidate will be a strategic thinker with strong technical leadership skills and a passion for building data-driven ecosystems that empower business insights.

Key Responsibilities :

- Lead the end-to-end design and development of robust data pipelines using Snowflake and SQL Server.

- Architect and implement Medallion Architecture (Bronze, Silver, Gold layers) for structured and semi-structured data.

- Drive the adoption of Data Mesh principles to promote domain-oriented, decentralized data ownership.

- Collaborate with data analysts, scientists, and business teams to translate requirements into scalable solutions.

- Ensure data quality, governance, and lineage across all data assets.

- Optimize data workflows for performance, scalability, and cost efficiency.

- Mentor and guide data engineers, fostering a culture of technical excellence and innovation.

- Stay current with emerging data engineering technologies and recommend continuous improvements.
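For candidates less familiar with the Medallion Architecture named above, the Bronze/Silver/Gold layering can be sketched roughly as follows. This is a minimal, illustrative Python stand-in for what would be Snowflake tables and pipelines in practice; all field names (`policy_id`, `premium`, `region`) and transformations are hypothetical examples, not the employer's actual schema:

```python
# Bronze layer: raw ingested records, landed as-is
# (duplicates, string-typed numbers, inconsistent casing).
bronze = [
    {"policy_id": "P-1", "premium": "1200", "region": "west"},
    {"policy_id": "P-1", "premium": "1200", "region": "west"},   # duplicate row
    {"policy_id": "P-2", "premium": "800",  "region": "EAST"},
]

def to_silver(rows):
    """Silver layer: cleansed, de-duplicated, consistently typed records."""
    seen, out = set(), []
    for r in rows:
        key = r["policy_id"]
        if key in seen:          # drop duplicate keys
            continue
        seen.add(key)
        out.append({
            "policy_id": key,
            "premium": float(r["premium"]),        # enforce numeric type
            "region": r["region"].strip().lower()  # normalize casing
        })
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (total premium per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'west': 1200.0, 'east': 800.0}
```

The same progression in Snowflake would typically use raw landing tables (Bronze), cleansed/conformed tables or views (Silver), and aggregated, consumption-ready marts (Gold).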

Qualifications & Skills :

Education : Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

Experience : 8 to 12 years in data engineering, including 3+ years in a technical leadership role.

Core Skills :

- Strong hands-on experience with Snowflake (data ingestion, transformation, performance tuning).

- Advanced proficiency in SQL Server and T-SQL for complex data operations.

- Deep understanding of Medallion Architecture and Data Mesh principles.

- Experience with ELT/ETL tools, Git, and CI/CD pipelines.

- Familiarity with data orchestration tools (Airflow, dbt) and cloud platforms (AWS or Azure).

- Strong analytical, problem-solving, and leadership abilities.

Good to Have :

- Experience with Kafka, Spark Streaming, AWS, dbt, or Prompt Engineering.

- Proficiency in Python or .NET for data processing.

- Insurance domain knowledge is highly preferred.

