
JSW One Platforms - Data Engineer - ETL/BigQuery

JSW Steel Ltd
Mumbai
2 - 5 Years
3.9 | 6,737+ Reviews

Posted on: 28/11/2025

Job Description

Role Overview :

We are looking for a mid-level Data Engineer to help build, maintain, and evolve our data infrastructure, ensuring that business analysts and stakeholders across different lines of business have reliable, unified access to data and enabling self-service reporting and dashboards. You will work closely with analytics, operations, and business stakeholders to transform raw data from multiple sources into actionable, high-quality datasets and reports.

Key Responsibilities :


- Build, maintain, and optimize data pipelines (batch and streaming, as needed) to extract, transform, and load (ETL/ELT) data from multiple, disparate source systems into a centralized data warehouse (a brief illustrative sketch follows this list).

- Implement and manage data warehousing solutions using cloud-native technologies (e.g. a data warehouse such as BigQuery on Google Cloud Platform, along with data ingestion/orchestration tools).

- Integrate data from diverse operational platforms (for different lines of business/departments) to create a unified, analytics-ready dataset.

- Perform data modeling and schema design to support reporting and analytics needs (e.g. star/snowflake schemas, dimensional models) and optimize storage and query performance.

- Ensure data quality, consistency, and reliability: implement validation/cleaning/transformation logic, monitor pipelines, handle issues, and maintain data governance and compliance processes.

- Work with business analysts, stakeholders and reporting/BI teams to understand business reporting requirements, translate them into data solutions, and deliver datasets for dashboards.

- Collaborate with BI/analytics/reporting teams and tools - support the creation of dashboards and reports for various departments (e.g. sales performance, operations, executive dashboards, order management, customer service).

- Document data pipelines, data models, data flows, business logic, and data definitions to support maintainability and team knowledge sharing.

- Help optimize data processing workflows for performance and cost-efficiency, including efficient querying and storage.
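For illustration only (not part of the role description itself): a minimal sketch of the kind of batch ETL step described above - extracting records from a hypothetical source API, applying a light transformation, and appending them to a BigQuery table via the google-cloud-bigquery client. The endpoint, project, table, and field names are invented for this example.

# Illustrative sketch only - the endpoint, table, and field names below are
# assumptions, not taken from this job description.
import requests
from google.cloud import bigquery

def extract(api_url: str) -> list[dict]:
    # Pull raw rows from a hypothetical source system's REST endpoint.
    response = requests.get(api_url, timeout=30)
    response.raise_for_status()
    return response.json()

def transform(rows: list[dict]) -> list[dict]:
    # Basic cleaning: drop rows missing a key field, normalise casing, cast types.
    return [
        {
            "order_id": r["order_id"],
            "region": (r.get("region") or "").upper(),
            "amount": float(r["amount"]),
        }
        for r in rows
        if r.get("order_id") is not None
    ]

def load(rows: list[dict], table_id: str) -> None:
    # Append the cleaned rows to a BigQuery table.
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND
    )
    client.load_table_from_json(rows, table_id, job_config=job_config).result()

if __name__ == "__main__":
    raw = extract("https://example.internal/api/orders")      # hypothetical endpoint
    load(transform(raw), "my-project.analytics.orders_fact")  # hypothetical table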

Required Qualifications & Skills :

- Bachelor's degree in Computer Science, Engineering, or related technical discipline (or comparable experience).

- 2-5 years of practical experience in data engineering, ETL/ELT pipelines, or related backend/data roles.

- Strong proficiency in SQL and a programming/scripting language such as Python (or similar); a short example combining the two appears after this list.

- Experience with data warehousing platforms and cloud-based data infrastructure (preferably cloud-native warehouses or lakehouses).

- Solid understanding of data modeling, schema design (dimensional models), data normalization/denormalization, and data transformation concepts.

- Experience developing and maintaining ETL/ELT pipelines, working with both batch and (optionally) streaming datasets.

- Strong problem-solving skills, attention to detail, and commitment to data quality, consistency and reliability.

- Good communication and collaboration skills - able to liaise with non-technical stakeholders (business analysts, operations, leadership), understand requirements, and translate them into technical solutions.
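Purely as an illustration of the SQL-plus-Python combination referenced above (not a requirement of the role): a sketch of a star-schema style aggregation over invented fact and dimension tables, executed through the BigQuery Python client and returned in a shape a dashboard/BI layer could consume.

# Illustrative only - the project, dataset, table, and column names are invented.
from google.cloud import bigquery

STAR_SCHEMA_QUERY = """
    SELECT
        d.region,
        c.calendar_month,
        SUM(f.net_amount) AS total_sales
    FROM `my-project.analytics.sales_fact` AS f        -- fact table
    JOIN `my-project.analytics.dim_region` AS d        -- dimension: region
        ON f.region_key = d.region_key
    JOIN `my-project.analytics.dim_calendar` AS c      -- dimension: date
        ON f.date_key = c.date_key
    GROUP BY d.region, c.calendar_month
    ORDER BY c.calendar_month, total_sales DESC
"""

def monthly_sales_by_region() -> list[dict]:
    # Run the dimensional query and return plain dicts for a reporting layer.
    client = bigquery.Client()
    return [dict(row) for row in client.query(STAR_SCHEMA_QUERY).result()]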
