hirist

Holcim Global Digital Hub - AWS Data Engineer - Python/SQL/ETL

Posted on: 29/07/2025

Job Description

About The Role :


The Data Engineer will play an important role in enabling the business to run data-driven operations and decision-making in an Agile, product-centric IT environment.


Roles & Responsibilities :


- Design, develop, and maintain robust, scalable data pipelines and ETL workflows to support various business use cases.


- Build and optimize data ingestion processes from multiple structured and unstructured data sources.

- Collaborate with data scientists, analysts, and product teams to understand data requirements and deliver reliable datasets.

- Develop and maintain data models and schemas for efficient storage and retrieval in Data Lakes and Data Warehouses.

- Implement data quality checks and monitor pipeline performance to ensure data accuracy and system reliability.

- Optimize SQL queries and PySpark jobs to improve performance on big data platforms.

- Manage and automate workflows using tools such as Airflow, Azkaban, or Luigi to ensure timely data delivery.

- Participate in architecture discussions and propose scalable solutions leveraging cloud-native technologies like AWS Redshift, Glue, Lambda, or GCP equivalents.

- Troubleshoot and resolve production issues related to data pipelines and integrations.

- Document data engineering processes, standards, and best practices.


Education & Qualifications :

- Bachelor's degree in Engineering (BE/B.Tech) from IIT or Tier I/II colleges.

- Certification in Cloud Platforms such as AWS or GCP is highly preferred.


Experience :


- 4 to 8 years of total professional experience.


- Strong hands-on experience in Python coding is a must.

- Proven experience in data engineering or similar responsibilities.

- Hands-on experience with Big Data cloud platforms such as AWS Redshift, Glue, Lambda, Data Lakes, and Data Warehouses.

- Expertise in data integration and building robust data pipelines.

- Proficient in SQL and in writing code for Spark engines using Python and PySpark.

- Experience with data pipeline and workflow management tools like Azkaban, Luigi, Airflow, or similar.


Key Personal Attributes :

- Business-focused, customer- and service-minded

- Strong consultative and management skills

- Good communication and interpersonal skills

