hirist

HCL - Data Engineer - ETL/Data Warehousing

Posted on: 17/11/2025

Job Description

Responsibilities:


1. Design and implement scalable data pipelines using GCP services such as BigQuery, Dataflow, and Airflow.


2. Develop and maintain ETL processes to integrate various data sources into the data warehouse.


3. Collaborate with data analysts and business stakeholders to understand data requirements and deliver impactful data solutions.


4. Monitor and optimize data pipeline performance, ensuring reliability and efficiency.


5. Implement data quality checks and validation processes to maintain data integrity.


6. Create and manage data models using DBT to enable seamless data transformations.


7. Document processes, architectures, and data flows to enhance team knowledge sharing and compliance.


8. Stay updated with emerging technologies and industry trends in data engineering to recommend improvements and enhancements.
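As a hedged illustration of the data quality checks mentioned in item 5 above, the sketch below shows a minimal row-level validation step of the kind that might run inside a pipeline. The field names and rules are hypothetical examples, not part of this role's actual stack.

```python
# Minimal sketch of a row-level data quality check (item 5).
# required_fields is a hypothetical example; real pipelines would
# typically drive such rules from a schema or config.

def validate_rows(rows, required_fields=("id", "created_at")):
    """Split rows into valid and invalid based on simple null checks."""
    valid, invalid = [], []
    for row in rows:
        if all(row.get(field) is not None for field in required_fields):
            valid.append(row)
        else:
            invalid.append(row)
    return valid, invalid
```

In practice a check like this would feed invalid rows to a quarantine table and emit metrics, rather than silently dropping them.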


Requirements:


1. Bachelor's or Master's degree in Computer Science, Engineering, or a related field.


2. 6 to 10 years of experience in data engineering or a related role, specifically with GCP.


3. Strong expertise in BigQuery and its optimization techniques for large datasets.


4. Proficiency in Python for scripting and building data pipelines.


5. Experience with Apache Airflow for orchestrating complex workflows.


6. Hands-on experience with Dataflow for real-time data processing.


7. Familiarity with DBT for data transformation and modeling in a cloud environment.

