hirist

Data Engineer - Google Cloud Platform

Aliqan Services Private Limited
Multiple Locations
5 - 10 Years

Posted on: 13/07/2025

Job Description

We're Hiring : GCP Data Engineer

Contract : 6+ Months (Extendable)

Experience Level : 5+ Years

Location : Remote

About the Role :

Are you passionate about building modern data infrastructure on the cloud? We're looking for an experienced GCP Data Engineer who can design, develop, and maintain scalable data pipelines and workflows on Google Cloud Platform. This is a fantastic opportunity to work in a high-impact, fast-paced environment where your contributions directly influence decision-making and business outcomes.

As a key member of our engineering team, you'll be responsible for architecting and implementing robust ETL/ELT pipelines, managing large-scale data processing systems, and ensuring end-to-end data quality and governance. You'll work collaboratively with data analysts, scientists, and business stakeholders to deliver scalable solutions that support strategic analytics and insights.

Key Responsibilities :

- Design, implement, and optimize scalable ETL/ELT pipelines using GCP-native tools.

- Build and manage streaming and batch data workflows with Dataflow, Pub/Sub, and BigQuery.

- Ensure data accuracy, security, and compliance across the entire platform.

- Develop solutions for data ingestion, transformation, orchestration, and quality control.

- Automate recurring data engineering tasks using Cloud Composer (Apache Airflow) and other tools.

- Collaborate with cross-functional teams including data scientists, BI analysts, and product teams.

- Monitor performance and proactively resolve issues to maintain data pipeline reliability.

- Contribute to the continuous improvement of data architecture, data governance, and operational efficiency.
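For illustration, the extract-transform-load pattern with quality control described above can be sketched in plain Python. This is a minimal, hypothetical example (the record fields, quality rule, and in-memory "sink" standing in for a BigQuery table are all assumptions, not part of the role's actual codebase):

```python
import json

def extract(lines):
    """Extract: parse newline-delimited JSON records (a common ingestion format)."""
    return [json.loads(line) for line in lines]

def transform(records):
    """Transform: apply a simple quality rule and normalize a field."""
    return [
        {**r, "country": r["country"].upper()}  # normalize country code
        for r in records
        if r.get("amount", 0) > 0  # quality control: discard non-positive amounts
    ]

def load(records, sink):
    """Load: append records to a sink (here a list, standing in for a table)."""
    sink.extend(records)

raw = ['{"country": "in", "amount": 10}', '{"country": "us", "amount": -5}']
table = []
load(transform(extract(raw)), table)
print(table)  # only the valid record survives, with its country code upper-cased
```

In a GCP deployment, the extract step would typically read from Cloud Storage or Pub/Sub, the transform would run in Dataflow or BigQuery SQL, and Cloud Composer would orchestrate the whole run.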

Required Skills and Experience :

- 5+ years of experience as a Data Engineer, with at least 2 years on Google Cloud Platform (GCP).

- Strong hands-on experience with BigQuery, Cloud Composer (Airflow), Dataflow, Pub/Sub, and Cloud Storage.

- Proficiency in Python and SQL for data manipulation, transformation, and analysis.

- Deep understanding of data architecture, ETL/ELT patterns, and data modeling principles.

- Experience implementing CI/CD pipelines for data workflows and managing infrastructure as code.

- Familiarity with data security, access controls, and compliance best practices.

- Strong communication skills and ability to work in a collaborative, remote-first team environment.

Bonus Skills (Nice to Have) :

- Experience with Terraform or Deployment Manager for GCP infrastructure automation.

- Knowledge of machine learning pipelines, real-time analytics, or data observability tools.

- Exposure to Kafka, Looker, dbt, or other modern data stack tools.

- Background in finance, retail, or telecom data domains is a plus.

Why Work With Us ?

- Work remotely from anywhere in a flexible, remote-friendly setup

- Get involved in cutting-edge GCP projects

- Collaborative team culture with growth opportunities

- Potential for long-term extension based on performance

Interested ? Send your updated resume and take the next step in your data engineering journey!

