Posted on: 28/01/2026
Description:
Auxo is looking for a Senior Data Engineer - GCP Native Platform to design, build, and operate scalable, production-grade data platforms for our enterprise clients.
This is a hands-on delivery role, not a support function. You will own end-to-end data pipeline engineering on Google Cloud Platform, embedding best-in-class data engineering practices across batch and streaming workloads, while working closely with architects, analysts, and data scientists.
Location: Bangalore, Hyderabad, Mumbai, and Gurgaon
Responsibilities:
- Design, build, and optimize scalable batch and streaming data pipelines using Dataflow (Apache Beam)
- Develop and manage workflow orchestration using Airflow on Cloud Composer
- Implement ELT workflows using Dataform for SQL-based data modeling and transformation
- Design and maintain BigQuery datasets following layered / medallion architecture patterns
- Implement event-driven ingestion and CDC patterns using Pub/Sub
- Partner with architects to implement technical designs, standards, and platform best practices
- Ensure performance optimization, reliability, monitoring, and cost efficiency of data pipelines
- Implement data quality checks, validations, and monitoring within pipelines
- Support production deployments, incident resolution, and operational stability
- Mentor junior engineers and contribute to engineering excellence across the team
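Two of the patterns named above (streaming aggregation over event-time windows and in-pipeline data-quality checks) can be illustrated with a minimal, stdlib-only Python sketch. This is a toy stand-in, not Dataflow/Beam code; the event shape, `is_valid`, and `tumbling_window_sums` are illustrative assumptions.

```python
from collections import defaultdict
from datetime import datetime, timedelta, timezone

# Toy events: (event_time, value). In a real pipeline these would arrive via Pub/Sub.
EVENTS = [
    (datetime(2026, 1, 28, 12, 0, 5, tzinfo=timezone.utc), 10),
    (datetime(2026, 1, 28, 12, 0, 42, tzinfo=timezone.utc), 7),
    (datetime(2026, 1, 28, 12, 1, 3, tzinfo=timezone.utc), 3),
]

def is_valid(event):
    """Data-quality gate: timezone-aware timestamp and a non-negative value."""
    ts, value = event
    return ts.tzinfo is not None and value >= 0

def tumbling_window_sums(events, width=timedelta(minutes=1)):
    """Sum values per fixed (tumbling) event-time window -- the same shape of
    computation Beam expresses with FixedWindows plus a combine per key."""
    sums = defaultdict(int)
    for ts, value in filter(is_valid, events):
        epoch = ts.timestamp()
        # Floor the event time to the start of its window.
        start = datetime.fromtimestamp(epoch - epoch % width.total_seconds(),
                                       tz=timezone.utc)
        sums[start] += value
    return dict(sums)
```

At production scale the same logic would be expressed as a Beam pipeline so the runner handles parallelism, late data, and checkpointing; the sketch only conveys the windowing and validation ideas.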
Requirements:
Required Skills & Experience:
Data Engineering (Strong Hands-on Experience):
- Design and development of production-grade data pipelines
- Batch and streaming data processing architectures
- Workflow orchestration and dependency management
- Data modeling, schema design, and performance optimization
- Pipeline monitoring, troubleshooting, and cost optimization
GCP Data Platform:
- Hands-on experience with BigQuery (advanced SQL, partitioning, clustering, optimization)
- Strong experience with Dataflow / Apache Beam (Python or Java)
- Experience with Cloud Composer / Airflow
- Experience with Pub/Sub and Cloud Storage
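One practical consequence of building ingestion on Pub/Sub is that delivery is at-least-once, so consumers must tolerate redelivered messages. The sketch below is not the Pub/Sub client API; it is a stdlib-only illustration of idempotent ingestion, and the name `ingest_once` and the `(message_id, payload)` tuple shape are assumptions for the example.

```python
def ingest_once(messages, seen_ids=None):
    """Deduplicate at-least-once deliveries by message id so that a
    redelivered message does not produce a duplicate downstream record."""
    seen_ids = set() if seen_ids is None else seen_ids
    accepted = []
    for msg_id, payload in messages:
        if msg_id in seen_ids:
            continue  # redelivered duplicate: acknowledge and skip
        seen_ids.add(msg_id)
        accepted.append(payload)
    return accepted
```

In a real CDC pipeline the dedup state would live in a durable store (or be handled by BigQuery `MERGE` semantics) rather than an in-memory set, but the idempotency requirement is the same.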
Technical Foundation:
- Strong proficiency in SQL and Python (Java is a plus)
- Solid understanding of ETL/ELT patterns and modern data stack concepts
- Experience with Git-based version control and CI/CD pipelines
- Working knowledge of cloud monitoring and logging
Preferred Qualifications:
- GCP Professional Data Engineer certification
- Exposure to Dataform or dbt for transformation workflows
- Experience with real-time streaming architectures
- Familiarity with Vertex AI, Cloud Functions, or Dataproc
- Understanding of data governance concepts and platforms (Dataplex, Atlan, Collibra)
- Experience with legacy-to-cloud data migrations
- Familiarity with Looker or Power BI
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1606586