hirist

Job Description

Description:


- We are seeking a skilled GCP Data Engineer / Developer to design, build, and optimize scalable data pipelines and applications on Google Cloud Platform (GCP). The ideal candidate will have hands-on experience with BigQuery, PySpark, and Airflow DAGs; Java and ReactJS development experience for end-to-end data and application development is an added advantage.


Key Responsibilities:

- Design, develop, and maintain data ingestion and transformation pipelines on GCP using BigQuery, PySpark, and Airflow DAGs.

- Implement data processing and analytics workflows aligned with performance, scalability, and reliability standards.

- Collaborate with cross-functional teams (data architects, analysts, and application developers) to deliver integrated solutions.

- Optimize BigQuery SQL scripts for efficiency and cost management.

- Develop reusable data processing components and automation frameworks.

- (Optional) Contribute to frontend and backend development using ReactJS and Java when required.

- Troubleshoot and resolve data or application-related issues proactively.

- Follow Agile methodologies, participate in sprint planning, and deliver high-quality code.


Required Skills & Experience:


- 5+ years of experience in GCP-based data engineering.

- Strong expertise in BigQuery, PySpark, and Airflow (DAG development).

- Proficiency in SQL and data modeling concepts.

- Experience with CI/CD pipelines and version control using Git.

- Good understanding of data integration, ETL, and orchestration frameworks.

- Excellent problem-solving and debugging skills.

- Strong communication and collaboration abilities.


Nice to Have:

- Experience with Java (Spring Boot) for backend development.

- Knowledge of ReactJS for UI development.

- Familiarity with Cloud Composer, Dataflow, or Pub/Sub.

- Exposure to DevOps practices and containerization (Docker/Kubernetes).

