Posted on: 07/10/2025
Job Summary:
We are looking for an experienced Data Engineer with strong hands-on expertise in Google Cloud Platform (GCP) and BigQuery to design, build, and manage scalable data pipelines. The ideal candidate will have a deep understanding of ETL frameworks, data modeling, and cloud-based analytics solutions.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL workflows on GCP.
- Work extensively with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Composer (Airflow).
- Optimize data storage, query performance, and cloud cost efficiency.
- Collaborate with data scientists, analysts, and business teams to deliver clean, structured datasets.
- Implement data quality checks, lineage, and governance across data platforms.
- Automate data workflows using Python, SQL, and orchestration tools (see the pipeline sketch after this list).
- Integrate data from multiple sources, including APIs, on-prem databases, and third-party applications.
- Ensure security, scalability, and reliability in all data architecture designs.
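For illustration only, here is a minimal sketch of the kind of pipeline this role would own: a Cloud Composer (Airflow) DAG that loads daily files from Cloud Storage into a BigQuery staging table, then runs a SQL transform. All project, bucket, dataset, and table names below are hypothetical placeholders, not part of this posting:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    # Hypothetical names for illustration only.
    PROJECT = "example-project"
    DATASET = "analytics"

    with DAG(
        dag_id="gcs_to_bigquery_daily",
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Load the day's raw CSV exports from Cloud Storage into a staging table.
        load_raw = GCSToBigQueryOperator(
            task_id="load_raw",
            bucket="example-raw-bucket",
            source_objects=["events/{{ ds }}/*.csv"],
            destination_project_dataset_table=f"{PROJECT}.{DATASET}.events_staging",
            source_format="CSV",
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_TRUNCATE",
        )

        # Deduplicate and append into the curated table with a SQL transform.
        transform = BigQueryInsertJobOperator(
            task_id="transform",
            configuration={
                "query": {
                    "query": f"""
                        INSERT INTO `{PROJECT}.{DATASET}.events`
                        SELECT DISTINCT * FROM `{PROJECT}.{DATASET}.events_staging`
                    """,
                    "useLegacySql": False,
                }
            },
        )

        load_raw >> transform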
Required Skills & Experience:
- Hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
- Proficiency in SQL and Python (see the BigQuery sketch after this list)
- Strong understanding of ETL concepts, data modeling, and data warehousing
- Familiarity with CI/CD pipelines (Git, Jenkins, etc.)
- Experience with workflow orchestration tools such as Apache Airflow or Cloud Composer
- Working knowledge of DataOps and DevOps best practices
- Excellent problem-solving, debugging, and communication skills
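As a small sketch of the SQL-plus-Python work described above, the snippet below uses the google-cloud-bigquery client to create a date-partitioned, clustered table; the project, dataset, and column names are assumed placeholders:

    from google.cloud import bigquery

    # Hypothetical project name for illustration only.
    client = bigquery.Client(project="example-project")

    # Partitioning by date and clustering by a frequent filter column
    # keeps scans (and cost) proportional to the data actually queried.
    ddl = """
    CREATE TABLE IF NOT EXISTS `example-project.analytics.events`
    (
      event_ts TIMESTAMP,
      user_id  STRING,
      action   STRING
    )
    PARTITION BY DATE(event_ts)
    CLUSTER BY user_id
    """

    client.query(ddl).result()  # blocks until the DDL job finishes

Partitioning and clustering choices like these are one common way to meet the "optimize data storage, query performance, and cloud cost efficiency" responsibility above.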
Good to Have:
- Experience with Snowflake, Databricks, or Terraform
- Knowledge of machine learning pipelines or streaming data (Kafka)
- Exposure to data visualization tools (Tableau, Power BI, Looker)
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1556703