Posted on: 06/11/2025
Description:
Key Responsibilities:
- Collaborate with business and technology teams to understand current and future data requirements.
- Design, build, and maintain scalable data infrastructure for data collection, storage, transformation, and analysis.
- Develop and manage data pipelines, data models, and data workflows, ensuring high performance and reliability.
- Build and optimize data platforms such as data warehouses, data lakes, and data lakehouses for both structured and unstructured data.
- Implement automation scripts and analytical tools to support data engineering processes.
- Ensure data quality, security, and performance through continuous monitoring and optimization.
Required Skills & Tools:
- Google Cloud Platform (GCP): BigQuery, Dataflow, Dataproc, Data Fusion, Cloud SQL
- Workflow Orchestration: Apache Airflow, Tekton
- Infrastructure as Code (IaC): Terraform
- Programming: Python, PySpark
- Database Technologies: PostgreSQL, SQL
- API Integration & Automation
Preferred Experience:
- Strong experience with BigQuery and end-to-end pipeline development.
- Proven track record in managing complex data workflows using Airflow and Tekton.
- Understanding of data security, governance, and optimization techniques.
Posted in: Data Engineering
Functional Area: DevOps / Cloud
Job Code: 1570447