
AuxoAI - Data Engineer - Python/SQL/ETL

hirist.tech
Bangalore
2 - 4 Years

Posted on: 18/11/2025

Job Description

Note : If shortlisted, you will be invited for initial rounds on 6th December'25 (Saturday) in Bangalore


Description :


AuxoAI is hiring a Data Engineer to join our growing data engineering team focused on building production-grade pipelines on Google Cloud Platform (GCP). This is a hands-on role ideal for someone early in their data career who's eager to learn fast, work with modern cloud-native tools, and support the development of scalable data systems.

You'll work alongside experienced engineers and analysts on real client projects across industries, helping implement ELT processes, BigQuery pipelines, orchestration workflows, and foundational MLOps capabilities.

Responsibilities :


- Assist in developing scalable data pipelines using GCP tools such as Dataflow, BigQuery, Cloud Composer, and Pub/Sub

- Write and maintain SQL and Python scripts for data ingestion, cleaning, and transformation

- Support the creation and maintenance of Airflow DAGs in Cloud Composer for orchestration

- Collaborate with senior data engineers and data scientists to implement data validation and monitoring checks

- Participate in code reviews, sprint planning, and cross-functional team meetings

- Help with documentation and knowledge base creation for data workflows and pipeline logic

- Gain exposure to medallion architecture, data lake design, and performance tuning on BigQuery
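To illustrate the ingestion-cleaning-transformation work described above, here is a minimal, hypothetical sketch of the raw-to-clean (bronze-to-silver) step using stdlib sqlite3 as a stand-in for BigQuery. The table names (raw_orders, clean_orders) and validation rules are illustrative assumptions, not part of the role description.

```python
# Hypothetical raw -> clean pipeline step, using sqlite3 in place of BigQuery.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id TEXT, amount TEXT);
    CREATE TABLE clean_orders (order_id TEXT PRIMARY KEY, amount REAL);
""")
# Seed some raw rows, including typical bad records.
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("A1", "10.5"), ("A2", "oops"), (None, "3.0"), ("A3", "-1")],
)

def build_clean_orders(conn: sqlite3.Connection) -> int:
    """Cleaning step: keep rows with an id and a positive numeric amount."""
    cleaned = []
    for order_id, amount in conn.execute("SELECT order_id, amount FROM raw_orders"):
        try:
            value = float(amount)
        except (TypeError, ValueError):
            continue  # drop rows whose amount is not numeric
        if order_id and value > 0:
            cleaned.append((order_id, value))
    conn.executemany("INSERT OR REPLACE INTO clean_orders VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

n = build_clean_orders(conn)  # only "A1" passes all checks, so n == 1
```

In production such a step would typically run as a BigQuery SQL job or Python task scheduled by an Airflow DAG in Cloud Composer, but the validate-then-load pattern is the same.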

Requirements :


- 2 to 4 years of relevant experience in data engineering, backend development, or analytics engineering

- Strong knowledge of SQL and working-level proficiency in Python

- Exposure to cloud platforms (GCP preferred; AWS/Azure acceptable)

- Familiarity with data pipeline concepts, version control (Git), and basic workflow orchestration

- Strong communication and documentation skills

- Eagerness to learn, take feedback, and grow under mentorship

Bonus Skills :


- Hands-on experience with GCP tools like BigQuery, Dataflow, or Cloud Composer

- Experience with dbt, Dataform, or Apache Beam

- Exposure to CI/CD pipelines, Terraform, or containerization (Docker)

- Knowledge of basic data modeling and schema design concepts

