Posted on: 27/01/2026
Description:
- Expert-level proficiency in SQL and relational database concepts
- Strong programming experience in Python for data pipeline development and automation
- Deep hands-on experience in dimensional data modeling, including star schema, snowflake schema, fact tables, and dimension tables
- Knowledge of slowly changing dimensions (SCDs), surrogate keys, grain definition, and hierarchical dimensions
- Experience designing, building, and maintaining ETL/ELT pipelines for production analytics and AI/ML data workflows
- Proven ability to lead technical initiatives and influence cross-functional teams
- Strong analytical, problem-solving, and communication skills
It would be great if you also have:
- Familiarity with modern data engineering tools and orchestration frameworks (dbt, Airflow, Fivetran, Segment)
- Experience implementing dimensional models using dbt or similar transformation frameworks
- Knowledge of event-driven or semi-structured data (JSON, logs, clickstream)
What will you be doing in this role?
- Lead the design and implementation of scalable, reliable data pipelines supporting product analytics, user behavior tracking, and AI/ML initiatives
- Own enterprise-level dimensional data modeling, including star schema and snowflake schema designs
- Define and enforce standards for data modeling, naming conventions, and ETL/ELT best practices across teams
- Provide strategic input on data architecture, ensuring alignment with product and AI/ML use cases
- Establish monitoring, testing, and alerting frameworks for pipeline performance, data quality, and data freshness
- Ensure integrity of fact and dimension data through reconciliation, validation, and automated quality checks
- Troubleshoot and resolve complex data issues impacting analytics, AI workflows, or product features
- Partner with Product Managers, Data Scientists, Analysts, and Software Engineers to translate business and AI requirements into high-quality data models and pipelines
About the Team:
- Full-time, IST
- 40 hours per week
- Hybrid working environment
Posted by: Sandhya Upadhyay, Associate TA Partner at Clarivate Analytics India Private Limited
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1606309