Job Description

Key Responsibilities:

Data Engineering & Development:


- Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data (an illustrative sketch follows this list).


- Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.

- Develop and optimize data architectures that support real-time and batch data processing.

- Build, optimize, and maintain CI/CD pipelines using tools like Jenkins, GitLab, or Google Cloud Build.

- Automate testing, integration, and deployment processes to ensure fast and reliable software delivery.
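
To make the first bullet above concrete, here is a minimal, hedged Python sketch of one batch-ingest building block: loading a CSV file from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, bucket, file, and table names are placeholders, not details from this role.

```python
# Minimal batch-ingest sketch: load a CSV from Cloud Storage into BigQuery.
# Requires `pip install google-cloud-bigquery` and Application Default Credentials.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery


def load_csv_to_bigquery(
    gcs_uri: str = "gs://example-bucket/raw/orders.csv",  # placeholder source file
    table_id: str = "example-project.analytics.orders",   # placeholder target table
) -> None:
    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()  # block until the load job finishes (raises on failure)

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_csv_to_bigquery()
```

In practice a step like this would sit inside a Dataflow or Cloud Composer pipeline rather than run standalone; the sketch only illustrates the building block.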

Cloud Infrastructure Management:


- Manage and deploy GCP infrastructure components to enable seamless data workflows.

- Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Infrastructure Automation and Management:


- Design, deploy, and maintain scalable and secure infrastructure on GCP.

- Implement Infrastructure as Code (IaC) using tools like Terraform.

- Manage Kubernetes clusters (GKE) for containerized workloads (see the sketch below).
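
As a small illustration of day-to-day GKE work, the Python sketch below lists the pods in one namespace using the official Kubernetes client. It assumes cluster credentials have already been fetched into the local kubeconfig (for example with gcloud container clusters get-credentials), and the namespace name is a placeholder.

```python
# Minimal GKE inspection sketch using the official Kubernetes Python client.
# Requires `pip install kubernetes` and a kubeconfig entry for the cluster
# (e.g. created with `gcloud container clusters get-credentials <cluster>`).
from kubernetes import client, config


def list_pods(namespace: str = "data-pipelines") -> None:  # placeholder namespace
    config.load_kube_config()      # read credentials from the local kubeconfig
    core_v1 = client.CoreV1Api()

    pods = core_v1.list_namespaced_pod(namespace=namespace)
    for pod in pods.items:
        print(f"{pod.metadata.name:<50} {pod.status.phase}")


if __name__ == "__main__":
    list_pods()
```

Declarative cluster and workload definitions would normally live in Terraform and Kubernetes manifests; the snippet only shows programmatic inspection.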

Collaboration and Stakeholder Engagement:


- Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.

- Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization:


- Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.

- Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines (a simple validation sketch follows this list).

- Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.
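
As one hedged example of the validation mentioned above, the Python sketch below runs a simple row-count check against the previous day's load of a BigQuery table. The table name, timestamp column, and threshold are hypothetical placeholders.

```python
# Minimal data-quality check sketch: fail loudly if yesterday's load of a
# BigQuery table looks too small. All names and thresholds are placeholders.
# Requires `pip install google-cloud-bigquery` and Application Default Credentials.
from google.cloud import bigquery


def check_row_count(
    table_id: str = "example-project.analytics.orders",  # placeholder table
    min_rows: int = 1_000,                                # placeholder threshold
) -> None:
    client = bigquery.Client()
    query = f"""
        SELECT COUNT(*) AS row_count
        FROM `{table_id}`
        WHERE DATE(ingested_at) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    """
    row_count = next(iter(client.query(query).result()))["row_count"]

    if row_count < min_rows:
        raise ValueError(
            f"{table_id}: only {row_count} rows loaded yesterday "
            f"(expected at least {min_rows})"
        )
    print(f"{table_id}: {row_count} rows loaded yesterday - OK")


if __name__ == "__main__":
    check_row_count()
```

A check like this would typically run as a task in Cloud Composer immediately after the load step, with alerting wired to its failure.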

Qualifications and Certifications:


Education:


- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.

Experience:


- 7 to 9 years of experience in data engineering, with at least 4 years working on GCP.

- Proven experience designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.

Certifications:


- Google Cloud Professional Data Engineer certification preferred.

Key Skills:


Mandatory Skills:


- Advanced proficiency in Python for data pipelines and automation.

- Strong SQL skills for querying, transforming, and analyzing large datasets.

- Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE).

- Hands-on experience with CI/CD tools such as Jenkins, GitHub Actions, or Bitbucket Pipelines.

- Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC).

- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer (a minimal DAG sketch follows this list).

- Strong understanding of Agile/Scrum methodologies.
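
To illustrate the orchestration skills above, here is a minimal Apache Airflow DAG sketch in Python; the DAG id, schedule, and task bodies are placeholders. On Cloud Composer the same file would simply be uploaded to the environment's DAGs bucket.

```python
# Minimal Airflow DAG sketch: a daily extract -> load sequence.
# DAG id, schedule, and task bodies are placeholders for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("extract: pull raw data from the source system")  # placeholder logic


def load() -> None:
    print("load: write transformed data to the warehouse")  # placeholder logic


with DAG(
    dag_id="daily_orders_etl",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task         # run load only after extract succeeds
```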

Nice-to-Have Skills:


- Experience with other cloud platforms like AWS or Azure.

- Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau).

- Understanding of machine learning workflows and their integration with data pipelines.
