
EXL - Data Engineer - Google Cloud Platform

hirist.tech
Gurgaon/Gurugram
4 - 8 Years

Posted on: 28/01/2026

Job Description

Note : If shortlisted, you will be invited for initial rounds on 7th February 2026 (Saturday) in Gurugram


Role : Data Engineer GCP


Job Description :


Summary :


- Designs, builds, and maintains the data infrastructure that supports the organization's data-related initiatives.


- Collaborates with cross-functional teams, including data scientists and software engineers, to ensure the efficient and reliable processing, storage, and retrieval of data.


- Develops scalable data pipelines, optimizes data workflows, and ensures the quality and integrity of the data.


What will you do :


- Design scalable and efficient data pipelines to extract, transform, and load data from various sources into data warehouses or data lakes.


- Develop data pipelines that enable efficient data storage, retrieval, and analysis.


- Implement data validation and quality checks to identify and address data anomalies or errors.


- Design data warehousing and data lake solutions that facilitate data storage, retrieval, and analysis.


- Document data engineering processes, workflows, and systems for reference and knowledge-sharing purposes.


- Implement data quality checks and validation processes to ensure the accuracy, completeness, and consistency of the data.


- Identify opportunities to streamline data engineering processes, improve efficiency, and enhance the quality of deliverables.


- Provide guidance and mentorship to junior data engineers to help them develop their technical skills and grow in their roles.


Required :


- 5+ Years of Experience with SQL and NoSQL databases


- 5+ Years of Experience with Python or a comparable scripting language


- 5+ Years of Experience with Data warehouses and infrastructure components


- 5+ Years of Experience with ELT/ETL and building high-volume data pipelines


- 5+ Years of Experience with reporting/analytics tools


- 5+ Years of Experience with query optimization, data structures, transformation, metadata, dependency, and workload management


- 5+ Years of Experience with Big Data and cloud architecture


- 5+ Years of hands-on experience building modern data pipelines within a major cloud platform (GCP, AWS, Azure)


- 3-5 Years of Experience with development and scaling of apps in containerized environments


- 5+ Years of Experience with real-time and streaming technology


- 3+ years of experience soliciting complex requirements and managing relationships with key stakeholders


- 3+ years of experience independently managing deliverables


Preferred :


- Experience in designing and building data engineering solutions in cloud environments (preferably GCP)


- Experience with Git, CI/CD pipelines, and other DevOps principles/best practices


- Experience with bash shell scripts, UNIX utilities, and UNIX commands


- ML/AI experience is a plus


- Understanding of software development methodologies including waterfall and agile


- Ability to leverage multiple tools and programming languages to analyze and manipulate data


- Knowledge of API development


- Experience with complex systems and solving challenging analytical problems


- Strong collaboration and communication skills within and across teams


- Knowledge of schema design and dimensional data modeling


- Google Professional Data Engineer Certification


- Knowledge of microservices and SOA


- Experience designing, building, and maintaining data processing systems
