hirist

GCP Data Engineer

QCENTRIO PRIVATE LIMITED
Multiple Locations
7 - 10 Years
Rating: 4.4 | 13+ Reviews

Posted on: 17/12/2025

Job Description

Description :

Job Position : GCP Data Engineer

Experience : 6+ years as a GCP Data Engineer

Location : Chennai, Pune, Trivandrum, Kochi

Notice Period : Immediate - 15 days only

Role Overview :

We are seeking a skilled Data Engineer to design, build, and maintain data pipelines and data models that support analytical and business intelligence needs. The ideal candidate will have hands-on experience with Python or SQL, Google Cloud Platform (GCP), and a strong understanding of data management, quality, and security best practices.

Key Responsibilities :

- Build and maintain moderately complex data pipelines, ensuring data flow, transformation, and usability for analytical projects.

- Design and implement data models, optimizing for performance and scalability.

- Apply knowledge of data characteristics and supply patterns to develop rules and tracking processes that support data quality models.

- Prepare data for analytical use by gathering, integrating, cleansing, and structuring data from multiple sources and systems.

- Perform design, creation, and interpretation of large and highly complex datasets.

- Troubleshoot pipeline and data issues to ensure accuracy and reliability.

- Stay up-to-date with GCP advancements and recommend innovative solutions.

- Implement security best practices within data pipelines and cloud infrastructure.

- Collaborate with global teams to share and adopt best practices in data management, maintenance, reporting, and security.

- Develop and execute data quality checks to ensure consistency and integrity.

- Work with credit data products and perform analysis using tools like Google BigQuery, BigTable, DataFlow, and Spark/PySpark.
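To illustrate the kind of rule-based data quality checks the role involves, here is a minimal Python sketch. All function names, field names, and sample records below are hypothetical, invented for illustration; they are not part of the job posting.

```python
# Minimal sketch of rule-based data quality checks (illustrative only;
# names and data are hypothetical, not from the job posting).

def check_not_null(rows, field):
    """Flag rows where a required field is missing or empty."""
    return [r for r in rows if r.get(field) in (None, "")]

def check_in_range(rows, field, lo, hi):
    """Flag rows whose numeric field falls outside [lo, hi]."""
    return [r for r in rows if not (lo <= r.get(field, lo - 1) <= hi)]

records = [
    {"account_id": "A1", "credit_score": 712},
    {"account_id": "",   "credit_score": 640},   # missing required ID
    {"account_id": "A3", "credit_score": 901},   # outside 300-850 range
]

missing_ids = check_not_null(records, "account_id")
bad_scores = check_in_range(records, "credit_score", 300, 850)

print(len(missing_ids))  # 1
print(len(bad_scores))   # 1
```

In practice such rules would run inside a pipeline (e.g. a Dataflow or Spark job) against data in BigQuery, with failures routed to tracking rather than printed.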

Mandatory Skills :

- Python or SQL Proficiency : Experience with Python or SQL and intermediate scripting for data manipulation and processing.

- GCP & Cloud Fundamentals : Intermediate understanding and experience with Google Cloud Platform (GCP) and overall cloud computing concepts.

- Data Pipeline Construction : Proven ability to build, maintain, and troubleshoot moderately complex pipelines.

- Data Modeling & Optimization : Experience designing and optimizing data models for performance.

- Data Quality Governance : Ability to develop rules, tracking processes, and checks to support a data quality model.

- Data Preparation & Structuring : Skilled in integrating, consolidating, cleansing, and structuring data for analytical use.

- Security Implementation : Knowledge of security best practices in pipelines and cloud infrastructure.

- Big Data Analysis Tools : Hands-on experience with Google BigQuery, BigTable, DataFlow, Scala + Spark or PySpark.

- Advanced Data Formats : Experience working with JSON, AVRO, and PARQUET formats.

- Communication & Best Practices : Strong communication skills to promote global best practices and guide adoption.

Preferred Qualifications :

- Cloud certification (e.g., GCP Data Engineer, AWS, Azure).

- Experience with credit data products.

- Familiarity with data governance frameworks and metadata management tools.

Technical Skills :

Python | SQL | GCP | BigQuery | BigTable | DataFlow | Spark / PySpark | JSON | AVRO | PARQUET



Functional Area

Big Data / Data Warehousing / ETL

Job Code

1591709