Job Description

Role Overview:

As a GCP Data Engineer, you will be instrumental in designing, building, and maintaining robust data pipelines and infrastructure on the Google Cloud Platform (GCP). You will collaborate closely with data scientists, analysts, and other engineers to ensure the efficient and reliable flow of data for various analytical and operational needs. Your work will directly impact the ability of our clients to gain valuable insights from their data, improve business processes, and drive innovation.


Key Responsibilities:

- Design and implement scalable and reliable data pipelines on GCP using services like Dataflow, Dataproc, BigQuery, and Cloud Storage to ingest, process, and transform large datasets for analytical and reporting purposes.


- Develop and maintain data models and schemas in BigQuery to ensure data quality, consistency, and accessibility for downstream users.


- Automate data ingestion, processing, and validation workflows using scripting languages like Python and orchestration tools like Apache Airflow or Cloud Composer to improve efficiency and reduce manual effort (a minimal orchestration sketch follows this list).


- Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement optimizations to ensure data is processed in a timely and cost-effective manner for business stakeholders.


- Collaborate with data scientists and analysts to understand their data requirements and provide them with the necessary data infrastructure and tools to perform their analyses effectively, enabling data-driven decision-making.


- Implement data security and governance policies on GCP to protect sensitive data and ensure compliance with relevant regulations, safeguarding client data integrity and confidentiality.
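
To ground the orchestration responsibility above, here is a minimal sketch of a Cloud Composer / Apache Airflow DAG that loads files from Cloud Storage into a BigQuery staging table and then promotes validated rows into a reporting table. Every name in it (the daily_events_ingestion DAG, the raw-events-bucket bucket, and the analytics dataset and tables) is a hypothetical placeholder, not a detail of any actual client environment.

```python
# A minimal sketch of a GCS-to-BigQuery ingestion DAG; every resource name
# below (bucket, dataset, tables, DAG id) is a hypothetical placeholder.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_events_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load that day's newline-delimited JSON files into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="raw-events-bucket",                 # assumed bucket name
        source_objects=["events/{{ ds }}/*.json"],  # templated by run date
        destination_project_dataset_table="analytics.events_staging",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )

    # Basic validation: only rows with a non-null event_id reach reporting.
    promote = BigQueryInsertJobOperator(
        task_id="promote_valid_events",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `analytics.events` (event_id, user_id, event_ts)
                    SELECT event_id, user_id, TIMESTAMP(event_ts)
                    FROM `analytics.events_staging`
                    WHERE event_id IS NOT NULL
                """,
                "useLegacySql": False,
            }
        },
    )

    load_raw >> promote
```

The exact operators, schedule, and validation rules would of course depend on the client's data sources and service-level requirements.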


Required Skillset:

- Demonstrated ability to design, build, and maintain data pipelines on Google Cloud Platform (GCP) using services like Dataflow, Dataproc, BigQuery, and Cloud Storage.


- Proven expertise in data modeling, schema design, and data warehousing concepts, ensuring data quality and accessibility (see the schema sketch after this list).


- Proficiency in scripting languages like Python and experience with orchestration tools like Apache Airflow or Cloud Composer for automating data workflows.


- Strong understanding of data security and governance principles, with the ability to implement security measures on GCP to protect sensitive data.


- Excellent communication and collaboration skills to work effectively with data scientists, analysts, and other engineers.


- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
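
As an illustration of the data modeling and schema design skills listed above, the following minimal sketch uses the google-cloud-bigquery client library to create an explicitly typed, partitioned, and clustered BigQuery table. The my-project.analytics.events table and its fields are hypothetical examples, not a prescribed schema.

```python
# A minimal sketch of an explicitly modeled BigQuery table; the project,
# dataset, table, and field names are hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client()

# An explicit schema keeps types stable and documented for downstream users.
schema = [
    bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("user_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("payload", "JSON", mode="NULLABLE"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)

# Partition by day on the event timestamp and cluster on user_id so that
# typical time-bounded, per-user queries scan less data.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["user_id"]

table = client.create_table(table, exists_ok=True)
print(f"Created or verified {table.full_table_id}")
```

Partitioning on the event timestamp and clustering on user_id are the kinds of choices that keep BigQuery scans timely and cost-effective as data volumes grow.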
