Posted on: 02/12/2025
Job Description:
- Design, develop, test, and deploy scalable and reliable data processing pipelines using Java 17+ and Apache Beam, executed on GCP Cloud Dataflow.
- Build and manage data orchestration workflows using Apache Airflow or GCP Cloud Composer, including creating and maintaining DAGs with common and custom operators.
- Work extensively with GCP Big Data services such as BigQuery, Bigtable, Cloud SQL, and Google Cloud Storage (GCS) for data ingestion, transformation, storage, and optimization.
- Write, optimize, and review complex SQL queries for data retrieval and analysis, particularly within BigQuery and Cloud SQL.
- Maintain high standards of code quality, testing, documentation, and best practices.
- Collaborate with cross-functional teams (Data Engineers, Cloud Architects, Developers) to deliver robust and scalable cloud solutions.
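The pipeline responsibilities above follow Beam's model of chained per-element transforms and aggregations. As a minimal sketch of that pattern in plain Java Streams (no Beam dependency; class and method names are illustrative), the map/filter/count steps below would correspond to FlatMapElements, Filter, and Count.perElement transforms on a PCollection in an actual Dataflow pipeline:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {
    // Count occurrences of each word in a batch of lines.
    // In a real Beam pipeline, each stage of this chain becomes a
    // transform applied to a PCollection<String>.
    static Map<String, Long> countWords(List<String> lines) {
        return lines.stream()
                // split each line into lowercase word tokens
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\W+")))
                // drop the empty tokens split() can produce at line edges
                .filter(w -> !w.isBlank())
                // group identical words and count each group
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = countWords(List.of("to be or not to be"));
        System.out.println(counts.get("to")); // 2
        System.out.println(counts.get("be")); // 2
    }
}
```

The same shape scales out on Dataflow because each step is stateless per element; only the final counting stage requires a grouping shuffle.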
Mandatory Skills:
- Strong knowledge of GCP
- Proficiency in Java 17+
- Experience with GCP Cloud Dataflow (Apache Beam SDK)
- Hands-on experience with Airflow/Cloud Composer - creating and managing DAGs and operators
- Strong understanding of GCP Big Data services - BigQuery, Bigtable, GCS, Cloud SQL
- Excellent SQL query writing and optimization skills
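The mandatory skills above pair Java 17+ with BigQuery SQL. As an illustration only (the table `my_dataset.events` and its columns are invented), a typical aggregation query can be kept readable inside Java 17 code using a text block; filtering on a date/partition column before aggregating is a common BigQuery cost and performance optimization:

```java
public class QuerySketch {
    // Hypothetical BigQuery aggregation held in a Java 17 text block.
    // Table and column names are illustrative, not from any real schema.
    static final String DAILY_EVENT_COUNTS = """
            SELECT event_type,
                   DATE(event_ts) AS event_date,
                   COUNT(*)       AS events
            FROM   `my_dataset.events`
            WHERE  DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
            GROUP BY event_type, event_date
            ORDER BY event_date, events DESC
            """;

    public static void main(String[] args) {
        System.out.println(DAILY_EVENT_COUNTS);
    }
}
```

In application code the string would be submitted through the BigQuery client rather than printed; the text block keeps the SQL formatted exactly as written, without concatenation noise.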
Good-to-Have Skills:
- Experience with GKE (Google Kubernetes Engine)
- Understanding of IAM (Identity and Access Management)
- Exposure to Cloud Spanner
- Experience developing secure API endpoints
- Knowledge of containerization and microservices
Notice period: 0 to 15 days only
Bachelor's degree in Computer Science, Engineering, or a related field.
Skills: GCP, Java, Cloud Dataflow
Posted in: DevOps / SRE
Functional Area: DevOps / Cloud
Job Code: 1583582