
Full Stack Data Engineer - Python/Airflow

FUTURES AND CAREERS
Tamil Nadu
4 - 6 Years
Rating: 4.5 (19+ Reviews)

Posted on: 11/08/2025

Job Description

About the Role:

We are seeking a skilled and versatile Full Stack Data Engineer with strong expertise in Python and cloud-based data engineering tools to design, develop, and maintain end-to-end data pipelines and infrastructure on Google Cloud Platform (GCP). The ideal candidate has hands-on experience with modern data orchestration, transformation, and storage technologies, and can deliver scalable, reliable data solutions that support analytics and business intelligence.

Key Responsibilities:

- Design, develop, and maintain scalable data pipelines using Python, PySpark, and Apache Airflow for data ingestion, transformation, and processing (a brief Airflow sketch follows this list).

- Build and manage data workflows with orchestration tools such as Airflow and Tekton to automate data movement and processing.

- Develop and deploy data transformation scripts using dbt (data build tool) and Dataform.

- Work with Google Cloud Platform services including Dataproc, Cloud Storage, BigQuery, Pub/Sub, and Data Fusion to create integrated data solutions.

- Implement infrastructure as code using Terraform to provision and manage cloud resources efficiently.

- Design and develop RESTful APIs to facilitate data access and integration across systems.

- Collaborate with data scientists, analysts, and product teams to understand data requirements and deliver high-quality datasets.

- Optimize data models and queries for performance and cost efficiency in BigQuery and other data stores.

- Implement monitoring, alerting, and logging to ensure reliability and troubleshoot issues in data pipelines.

- Adhere to data governance, security best practices, and compliance requirements.

- Document data architectures, processes, and standards to support knowledge sharing and consistency.
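
For illustration only, here is a minimal sketch of the kind of Airflow pipeline this role owns: a daily DAG whose ingestion task feeds a transformation task. Every name (DAG id, bucket, dataset) is a hypothetical placeholder, the task bodies are stubs standing in for real ingestion and transformation logic, and the schedule argument assumes Airflow 2.4+ (earlier releases use schedule_interval).

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_data():
    # Stub: in practice this might consume from Pub/Sub or an external API
    # and land raw files in a Cloud Storage bucket.
    print("ingesting raw events to gs://example-raw-bucket/ ...")


def transform_data():
    # Stub: in practice this might submit a PySpark job to Dataproc or
    # trigger a dbt/Dataform run that builds curated BigQuery tables.
    print("transforming raw events into curated BigQuery tables ...")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_raw_data)
    transform = PythonOperator(task_id="transform", python_callable=transform_data)

    ingest >> transform  # transform runs only after ingestion succeeds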

Required Skills and Qualifications:

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Proven experience in data engineering with strong Python programming skills.

- Hands-on experience with Apache Airflow for workflow orchestration and automation.

- Proficiency in PySpark and working with big data processing frameworks (a short PySpark example follows this list).

- Experience with Google Cloud Platform services: Dataproc, Cloud Storage, BigQuery, Pub/Sub, Data Fusion.

- Familiarity with infrastructure as code using Terraform.

- Experience with data transformation tools such as dbt and Dataform.

- Strong understanding of data modeling, ETL/ELT processes, and data warehousing concepts.

- Ability to develop and consume RESTful APIs.

- Knowledge of containerization and CI/CD pipelines is a plus.

- Excellent problem-solving skills and attention to detail.

- Strong communication and collaboration skills.
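
As a companion sketch, a small PySpark job of the sort described above: read raw JSON from Cloud Storage, aggregate it, and write the result to BigQuery. It assumes the spark-bigquery connector is on the cluster classpath (as it is on Dataproc); every bucket, dataset, and table name is a hypothetical placeholder.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-events-job").getOrCreate()

# Read the raw event files landed by the ingestion step (placeholder path).
events = spark.read.json("gs://example-raw-bucket/events/2025-01-01/")

# Example transformation: drop events without a user and count per user.
daily_counts = (
    events.filter(F.col("user_id").isNotNull())
    .groupBy("user_id")
    .agg(F.count("*").alias("event_count"))
)

# Write to BigQuery through the connector; temporaryGcsBucket stages the load.
(
    daily_counts.write.format("bigquery")
    .option("table", "example_dataset.daily_event_counts")  # placeholder table
    .option("temporaryGcsBucket", "example-temp-bucket")    # placeholder bucket
    .mode("overwrite")
    .save()
)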

