
Carrier Technologies - Technical Lead

Carrier Technologies India Limited
8 - 11 Years
Bangalore

Posted on: 17/03/2026

Job Description

Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

About the role:

We are seeking an experienced Tech Lead (Data) to join our Unified Data & AI team, supporting enterprise data initiatives across the Finance and Master Data domains. This pivotal role involves architecting, designing, and delivering scalable, reusable, and governed data products spanning the AWS and GCP cloud platforms, with Snowflake as a core component. The platform integrates data from over 60 ERP systems (SAP ECC, Atlas, JDE, Baan, etc.), requiring strong harmonisation, robust governance, and scalable architecture.

The ideal candidate will possess advanced SQL and Python skills, hands-on experience with data quality tools (Ataccama or similar) and data catalogues (Atlan or similar), and the ability to translate complex requirements into production-ready, governed data solutions.

Key Responsibilities:

Data Solution Architecture (AWS & GCP):

- Lead the design and technical architecture for data solutions across AWS and GCP, focusing on scalable ingestion, harmonisation, and consumption-ready layers.

- Architect solutions to address multi-ERP complexity and select cloud-native services optimising performance, cost, security, and maintainability.

- Advanced SQL proficiency is essential; Snowflake experience is a plus.

Data Product Development & Engineering:

- Deliver enterprise-grade data products (Customer, Vendor, Material, Finance) that are well-modelled, governed, reliable, and designed for reuse and scalability.

- Enforce high SQL standards (Snowflake-compatible), drive Python-based engineering, and conduct technical reviews of pipelines and models.

Data Quality & Governance:

- Lead the implementation of enterprise data quality frameworks using Ataccama.

- Implement data quality rules, controls, and remediation workflows.

- Drive metadata management and data catalogue adoption (Atlan or similar), ensuring governance, lineage, and stewardship are embedded in all data products.

Business Engagement & Requirement Translation:

- Serve as the bridge between business and engineering, translating complex business processes into scalable data models.

- Collaborate closely with stakeholders in Finance and Corporate teams, refine requirements, and ensure readiness for analytics, reporting, and AI/ML use cases.

Platform & Multi-Cloud Collaboration:

- Work across AWS, GCP, and Snowflake.

- Partner with platform architects, DevOps, and cloud operations to ensure solutions adhere to enterprise guardrails and influence platform evolution from a data product perspective.

Non-Functional Requirements:

- Ensure all solutions meet standards for performance, scalability, data security, reliability, cost efficiency (FinOps), and observability.

Stakeholder Communication & Governance:

- Represent the team in architecture and governance forums, provide technical estimations, communicate technical concepts to diverse audiences, and proactively identify and mitigate technical risks.

Required Qualifications:

- Minimum 8 years of experience in Data Engineering or Data Architecture roles.

- Hands-on experience with both AWS and GCP (multi-cloud exposure is mandatory).

- Advanced SQL expertise (enterprise scale; Snowflake-compatible preferred).

- Strong Python programming skills.

- Hands-on experience with Ataccama or a similar data quality tool.

- Experience with Atlan or similar data catalogue/governance tools.

- Expertise in data modelling (enterprise-grade).

- Proven experience designing scalable data pipelines and products.

- Experience building governed, reusable enterprise data models.

- Experience handling complex, multi-source ERP landscapes.

- Strong knowledge of data warehousing and analytics patterns.

- Experience with CI/CD, Infrastructure as Code, and DevOps practices.

- Experience supporting AI/ML workloads on data platforms.

- Track record as Tech Lead or Solution Owner in complex projects.

- Ability to balance architecture, scalability, and business needs.

Preferred Qualifications:

- Experience with Snowflake platform.

- Exposure to manufacturing, supply chain, or operational analytics.

- ERP experience (SAP ECC, JDE, Baan, etc.).

- Familiarity with Lakehouse or Data Product architectures.

- Experience promoting cloud-agnostic architecture principles.
