We are seeking a highly skilled Data Engineer with strong hands-on experience in Google Cloud Platform (GCP), BigQuery, PL/SQL, and Unix Shell Scripting. The ideal candidate will have 5-8 years of experience working with large-scale data systems, building efficient data pipelines, and implementing ETL processes on cloud-based platforms.
Key Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines on Google Cloud Platform (GCP) using BigQuery.
- Write optimized and complex PL/SQL queries, stored procedures, and packages for data processing and reporting.
- Develop and automate workflows using Unix/Linux Shell Scripting to manage data ingestion and transformation tasks.
- Collaborate with data architects, analysts, and business teams to gather requirements and ensure data solutions align with business goals.
- Perform data validation, quality checks, and troubleshoot performance bottlenecks.
- Monitor and optimize cloud-based data infrastructure for reliability, performance, and cost-effectiveness.
- Ensure data security, compliance, and best practices in data handling on GCP.
Required Skills:
- 5-8 years of overall IT experience with a strong focus on data engineering.
- Hands-on expertise in Google Cloud Platform (GCP) services, especially BigQuery, Cloud Storage, and Dataflow.