
Group Head - Data Engineering

HUDSONMANPOWER PRIVATE LIMITED
Mumbai
7 - 12 Years

Posted on: 02/09/2025

Job Description

Key Responsibilities:

- Lead the technical implementation of data pipeline and migration requirements, maximizing platform capabilities to support business stakeholders' objectives.

- Interface directly with stakeholders to gather requirements and deliver automated, end-to-end data engineering solutions.

- Design and implement robust data pipelines to automate ingestion, transformation, and augmentation of structured, unstructured, and real-time data.

- Drive continuous improvement by identifying, designing, and implementing process automation, optimizing data delivery, and re-architecting infrastructure for scalability.

- Troubleshoot and resolve data quality issues flagged by pipeline monitoring or downstream consumers.

- Enforce Data Governance best practices, including functional and technical impact analysis.

- Provide technical guidance and innovative ideas to improve data systems and solutions.

- Create and maintain clear documentation on data models, schemas, and transformation/validation rules.

- Implement tools to enable faster data extraction, analysis, and visualization for data consumers.

- Lead full software development lifecycle for batch ETL processes, including hands-on development, code reviews, testing, deployment, and documentation.

- Collaborate closely with internal product and technical teams to ensure seamless integration of data infrastructure.

- Migrate existing data applications and pipelines to cloud platforms (AWS), leveraging Platform-as-a-Service (PaaS) solutions.

Required Skills and Qualifications:

- 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL development, data warehousing, data marts, data lakes, Big Data, and AWS Cloud.

- Strong knowledge of Data Governance principles and best practices.

- Proven ability to design, build, and maintain scalable, automated data pipelines handling diverse data types and volumes.

- Expertise in troubleshooting and resolving complex data quality issues.

- Experience with cloud migration and cloud-native data solutions, particularly AWS PaaS offerings.

- Strong problem-solving skills and ability to work closely with stakeholders to understand and translate business needs into technical solutions.

- Excellent documentation and communication skills.

- Familiarity with the software development lifecycle, code reviews, testing, and deployment in an Agile environment.

- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

- Experience with modern data engineering tools and frameworks such as Apache Spark, Kafka, Airflow, or similar.

