
Zywave - Technical Lead - Data Engineering

Zywave
Pune
8 - 10 Years

Posted on: 03/09/2025

Job Description

Role: Technical Lead (Data Engineering, TurboRater RQR: CPQ Rating)

Location: Pune, India

Work Mode: 5 days work from office

About the Role:

Zywave is seeking a Technical Lead - Data Engineering (TurboRater RQR: CPQ Rating) with expertise in Snowflake, SQL Server, and modern data architecture principles, including Medallion Architecture and Data Mesh.

This role will play a critical part in the TurboRater RQR: CPQ Rating initiative, leading the design and implementation of scalable, secure, and high-performance data pipelines that power rating and CPQ (Configure, Price, Quote) capabilities.

The ideal candidate will combine deep technical expertise with insurance domain knowledge to drive innovation and deliver business impact.

Key Responsibilities:

- Lead end-to-end design and development of data pipelines supporting TurboRater RQR: CPQ Rating.

- Architect and implement Medallion Architecture (Bronze, Silver, Gold layers) for structured and semi-structured data.

- Drive adoption of Data Mesh principles, decentralizing ownership and promoting domain-oriented data products.

- Collaborate with business/product teams to align CPQ Rating requirements with scalable technical solutions.

- Ensure data quality, lineage, and governance across rating-related data assets.

- Optimize workflows for rating performance, scalability, and cost-efficiency.

- Mentor and guide engineers working on TurboRater initiatives.

- Stay current with data and insurtech innovations relevant to CPQ and rating platforms.

Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

- 8+ years of experience in data engineering with at least 3 years in technical leadership.

- Strong hands-on experience with Snowflake (data ingestion, transformation, performance tuning).

- Proficiency in SQL Server and T-SQL.

- Deep understanding of Medallion Architecture and Data Mesh principles.

- Experience with data orchestration tools (Airflow, dbt), cloud platforms (Azure/AWS), and CI/CD pipelines.

- Strong leadership and problem-solving skills.

- Knowledge of Python or Scala for data processing.

- Exposure to real-time data streaming (Kafka, Spark Streaming).

Mandatory Skills:

- Git.

- Snowflake.

- ELT tools.

- SQL Server.

- .NET.

- CPQ Rating / TurboRater exposure preferred.

Good to Have Skills:

- Prompt Engineering.

- Kafka.

- Spark Streaming.

- AWS.

- dbt.
