Wipro - Data Engineer - Google Cloud Platform

Wipro Enterprises
Bangalore
3 - 5 Years

Posted on: 15/10/2025

Job Description

About the company :

Founded in 1945 as a vegetable oil company, Wipro Consumer Care & Lighting is one of the fastest-growing FMCG companies, with a presence in 20 countries, predominantly in India, Asia, Africa, and the Middle East. Our business includes personal wash products, skincare products, male grooming, toiletries, wellness products, household products, domestic and commercial lighting, and modular office furniture. We have a turnover of USD 1.10 billion and more than 10,000 employees of 22 different nationalities. Our international business contributes more than 50% of our turnover.

About the Role :

We are seeking a talented Data Engineer to join our growing data team. The ideal candidate will have strong expertise in Google Cloud Platform (GCP), BigQuery, Python, and SQL, and will be responsible for designing and maintaining scalable data solutions that power business insights.

Key Responsibilities :

- Design, develop, and maintain robust data pipelines and ETL workflows.

- Manage large-scale datasets using GCP and BigQuery for optimal performance.

- Write clean, efficient Python and SQL code for data processing and transformation (an illustrative sketch follows this list).

- Collaborate with data analysts, scientists, and business teams to ensure data quality and availability.

- Optimize workflows for scalability, cost-efficiency, and performance.
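
For illustration only (not part of the original posting) : a minimal sketch of the kind of BigQuery transformation step this role involves, using the official google-cloud-bigquery Python client. The project, dataset, and table names are placeholders, not project specifics.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

    # Transformation query; the source table name is a placeholder.
    sql = """
        SELECT customer_id,
               DATE(order_ts) AS order_date,
               SUM(amount)    AS daily_spend
        FROM `my-gcp-project.raw.orders`
        GROUP BY customer_id, order_date
    """

    # Write the result to a destination table, overwriting it on each run.
    job_config = bigquery.QueryJobConfig(
        destination="my-gcp-project.analytics.daily_spend",  # placeholder table
        write_disposition="WRITE_TRUNCATE",
    )

    client.query(sql, job_config=job_config).result()  # block until the job finishes

In practice a step like this would be parameterized and scheduled rather than run ad hoc.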

Key Skills & Requirements :

- 3-4 years of professional experience as a Data Engineer.

- Hands-on expertise in GCP, BigQuery, Python, and SQL.

- Strong understanding of data warehousing concepts, ETL processes, and cloud architecture.

- Proficiency in data modeling, performance tuning, and query optimization.

- Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration tools (e.g., Airflow; an illustrative sketch follows this list).

- Excellent problem-solving skills, attention to detail, and ability to work in cross-functional teams.

- Strong communication and collaboration skills.
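
For illustration only : a minimal, hypothetical Airflow DAG showing how a transformation like the one above could be orchestrated. The DAG ID, schedule, and callable are placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def run_daily_transform():
        # Placeholder for a BigQuery transformation such as the sketch above.
        from google.cloud import bigquery

        bigquery.Client().query("SELECT 1").result()  # stand-in query


    with DAG(
        dag_id="daily_spend_pipeline",      # hypothetical DAG ID
        start_date=datetime(2025, 1, 1),
        schedule="@daily",                  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="run_daily_transform",
            python_callable=run_daily_transform,
        )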

Nice-to-Have :

- Experience with Apache Spark for distributed data processing (an illustrative sketch follows this list).

- Knowledge of Kubernetes for container orchestration and deployment.

- Exposure to Machine Learning pipelines and integration with data workflows.
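
For illustration only : a minimal PySpark sketch of the same daily aggregation, as one example of distributed processing with Spark. The GCS paths and column names are placeholders.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-spend-example").getOrCreate()

    # Read raw orders from a placeholder GCS path.
    orders = spark.read.parquet("gs://example-bucket/raw/orders/")

    # Same daily aggregation as the BigQuery sketch above.
    daily = (
        orders
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("customer_id", "order_date")
        .agg(F.sum("amount").alias("daily_spend"))
    )

    daily.write.mode("overwrite").parquet("gs://example-bucket/analytics/daily_spend/")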

We work 5.5 days a week from the office.
