Posted on: 05/02/2026
Description :
ABOUT US :
TangoEye is a company specializing in AI-powered video analytics, transforming CCTV data into actionable insights that help retailers understand customer shopping behavior and evaluate staff efficiency. By joining our team, you'll have the opportunity to work with cutting-edge technology, driving significant improvements in retail operations. We work across platforms in an innovative work culture that fosters career growth and professional development.
JOB DESCRIPTION :
Position : Data Lead
Location : Chennai
Experience : 8-12 years
Employment Type : Full-time
Role Summary :
We are seeking an experienced Data Lead to design, build, and manage scalable data systems including data pipelines, data lakes, and enterprise-level data architecture. This role requires strong technical leadership, hands-on engineering expertise, and the ability to lead and mentor a high-performing data team.
Key Responsibilities :
- Lead the design, development, and optimization of end-to-end data pipelines (batch & real-time).
- Architect and maintain enterprise-scale data lakes and data platforms.
- Define and implement data structures, data models, and schema design.
- Design and maintain data architecture diagrams, flow diagrams, and technical documentation.
- Ensure best practices for data security, governance, quality, and performance.
- Own data integration from multiple source systems (APIs, DBs, SaaS, streaming platforms).
- Collaborate with product, engineering, analytics, and business stakeholders.
- Lead cloud data platform architecture (AWS / Azure / GCP).
- Drive cost optimization, performance tuning, and system scalability.
- Manage and mentor data engineers, analysts, and architects.
- Oversee sprint planning, delivery, and code quality standards.
- Ensure high availability, monitoring, and incident response for data platforms.
REQUIRED TECHNICAL SKILLS :
Strong hands-on experience with :
- Data Pipelines : Airflow, Dagster, AWS Glue, Azure Data Factory, Kafka
- Data Lakes : S3, ADLS, GCS, Delta Lake, Iceberg, Hudi
- Data Warehouses : Snowflake, BigQuery, Redshift, Synapse
- Programming : Python, SQL, Spark, Scala (good to have)
EXPERTISE IN :
- Data Modeling : Star schema, Snowflake schema, normalization/denormalization
- ETL/ELT frameworks
- Distributed systems & big data architectures
Strong knowledge of :
- Data Governance, Lineage, and Cataloging
- DevOps for data (CI/CD pipelines, Terraform, GitOps)
Leadership & Management Skills :
- Proven experience in leading and scaling data teams.
- Strong stakeholder management and communication skills.
- Ability to translate business requirements into technical architecture.
- Experience in project planning, delivery management, and performance reviews.
- Mentorship, coaching, and talent development.
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1610054