hirist

Job Description

Manager, Data Engineering

Pune, Maharashtra, India

Full-time

Region: India

Position Overview:

We're seeking a Manager - Data Engineering to lead our growing data engineering team based in Pune. This leadership role will focus on building scalable, secure, and high-performance data platforms, pipelines, and products for our global, multi-tenant SaaS applications.

You will work closely with cross-functional teams including Product, Architecture, UX, QA, and DevOps to deliver robust data solutions that power products and reporting systems - especially in the areas of ETL pipelines, cloud-native data warehousing, and unstructured data lakes.

Key Responsibilities:

Team Leadership & Development:

- Lead, mentor, and retain a high-performing team of data engineers

- Conduct regular 1:1s, performance reviews, and growth planning

- Foster a collaborative team culture and instill best practices

Project & Delivery Management:

- Drive delivery of data solutions aligned with sprint and release goals

- Ensure on-time delivery, high code quality, and scalability

- Facilitate agile ceremonies: sprint planning, retrospectives, and stand-ups

Technical Execution & Architecture:

- Architect and guide the development of scalable ETL/ELT pipelines

- Build and maintain data lake solutions using AWS tools to manage unstructured and semi-structured data

- Work with large-scale datasets from diverse sources including APIs, logs, files, and internal systems

- Optimize performance, security, and maintainability of data pipelines

- Promote usage of tools such as Snowflake, dbt, Python, and SQL

Data Governance & Best Practices:

- Ensure adherence to internal coding standards and data security guidelines

- Implement best practices for data modeling, quality checks, and documentation

- Collaborate with architecture and infrastructure teams on cloud cost optimization and performance

Required Skills & Experience:

- 12-15 years in Data Engineering, Data Architecture, or similar roles

- Minimum 3 years in a leadership or managerial capacity

- Proven experience building robust ETL pipelines, preferably for multi-tenant SaaS platforms

- Strong hands-on technical expertise with:

1. AWS Services: S3, Glue, Lambda, Redshift, EMR

2. Data Platforms: Snowflake, dbt

3. Programming: Python, SQL

4. Unstructured Data Handling: Data lakes, JSON, XML, log data

- Expertise in SQL-based data warehousing and RDBMS platforms

- Knowledge of CI/CD, version control (GitHub), and Agile/Scrum methodologies

- Ability to balance technical depth with stakeholder communication and delivery tracking

- Understanding of modern lakehouse architecture and tools like Apache Hudi, Iceberg, or Delta Lake

Good to Have:

- Experience in product-based companies (SaaS, ESG, Supply Chain domains preferred)

- Familiarity with data security standards (e.g., GDPR, SOC2)

- Experience with orchestration tools (Airflow, Step Functions), data cataloging, or cost optimization
