hirist

Job Description

Job Title : Full Stack Data Engineer

Location : Mangalore / Udupi

Employment Type : Direct, Permanent

Mode : Hybrid

Skills & Technologies :

- Programming : Python, SQL, Shell scripting

- GCP Data Stack :

  - Storage : BigQuery, Cloud Storage

  - Processing : Dataflow (Apache Beam), Dataproc (Spark), Cloud Composer (Airflow)

  - Streaming : Pub/Sub

- APIs & Services : Cloud Run, Cloud Functions, API Gateway

- Monitoring : Cloud Logging, Cloud Monitoring (formerly Stackdriver), Prometheus/Grafana

- DevOps : Terraform, Git, Docker, Kubernetes, Cloud Build / Jenkins

- Data Visualization : Looker Studio, Power BI, Tableau

- Version Control & CI/CD : GitHub, GitLab, Bitbucket pipelines

Key Responsibilities :

- Build and maintain batch and streaming data pipelines on GCP.

- Develop ETL/ELT workflows using Airflow, Dataflow, and BigQuery.

- Manage data lakes and warehouses using BigQuery and Cloud Storage.

- Implement real-time data solutions using Pub/Sub and Dataflow.

- Build data APIs and microservices using Cloud Run and Cloud Functions.

- Ensure platform reliability, monitoring, and SLA/SLO compliance.

- Collaborate with DevOps on CI/CD, IaC, and automated monitoring.

- Participate in incident management, RCA, and change control.

- Mentor junior engineers and follow engineering best practices.

Key Competencies :

- Strong GCP, IAM, and cost management knowledge.

- Experience working under strict SLAs and ITIL processes.

- Understanding of data security, governance, and compliance.

- Good communication skills and ability to work in a 16/5 global support model.

Mandatory Skills :

- Python

- API Integration

- GCP

Good to Have :

- Google Professional Data Engineer certification

