hirist

Job Description


Key Responsibilities:

- Collaborate with business and technology teams to understand current and future data requirements.

- Design, build, and maintain scalable data infrastructure for data collection, storage, transformation, and analysis.

- Develop and manage data pipelines, data models, and data workflows, ensuring high performance and reliability.

- Build and optimize data platforms such as data warehouses, data lakes, and data lakehouses for both structured and unstructured data.

- Implement automation scripts and analytical tools to support data engineering processes.

- Ensure data quality, security, and performance through continuous monitoring and optimization.

Required Skills & Tools:

- Google Cloud Platform (GCP): BigQuery, Dataflow, Dataproc, Data Fusion, Cloud SQL

- Workflow Orchestration: Apache Airflow, Tekton

- Infrastructure as Code (IaC): Terraform

- Programming: Python, PySpark

- Database Technologies: PostgreSQL, SQL

- API Integration & Automation

Preferred Experience:

- Strong experience in BigQuery and end-to-end pipeline development.

- Proven track record in managing complex data workflows using Airflow and Tekton.

- Understanding of data security, governance, and optimization techniques.
