hirist

Job Description

Position : Lead Data Engineer


Experience : 9-14 Years


Work Mode : Hybrid (Hyderabad - Hitech City)


Engagement : Contract-to-Hire (C2H)


Walk-In Drive Date : 6th December 2025 (9:30 AM to 1:30 PM)


Hiring For : UK-based Banking Group - GCC Centre


Type : Contract-to-Hire (full-time conversion after 9 months)


About the Role :


We are looking for an experienced Lead Data Engineer to design, develop, and optimize large-scale data pipelines and analytics platforms on Google Cloud Platform (GCP). The role involves leading end-to-end data engineering initiatives and delivering high-quality, scalable, secure, and compliant data solutions for a global banking environment.


Key Responsibilities :


- Lead the design and development of ETL, batch, and streaming data pipelines.


- Build and optimize data processing workflows using Python, SQL, and Apache Beam.


- Develop scalable data architectures and perform data modelling and schema design to support analytics and reporting.


- Work extensively with BigQuery, Cloud Storage, Cloud SQL, and other GCP-native services.


- Implement best practices in data governance, security, privacy, and compliance, aligned with banking/regulatory requirements.


- Integrate machine learning workflows with GCP ML tools, ensuring seamless data availability for ML models.


- Optimize pipeline performance, cost efficiency, and reliability across the data ecosystem.


- Collaborate with cross-functional teams including Data Scientists, Product Owners, and Business Stakeholders.


- Mentor junior engineers and contribute to engineering best practices and standards.


Required Skills & Experience :


- 9-14 years of hands-on experience in data engineering with strong expertise in GCP.


- Proven experience in building ETL, batch, and real-time streaming pipelines.


- Strong programming skills in Python and SQL.


- In-depth experience with Apache Beam, Dataflow, BigQuery, Cloud Storage, and Cloud SQL.


- Expertise in data modelling, star/snowflake schemas, and analytical data store design.


- Good understanding of data governance, security controls, and compliance frameworks within cloud environments (preferably BFSI).


- Experience integrating or supporting machine learning pipelines on GCP.


- Strong knowledge of performance optimization for high-volume data pipelines.


- Excellent communication, leadership, and stakeholder management skills.

