Posted on: 19/11/2025
Description:
Key Responsibilities:
- Ensure data integrity, scalability, security, and reliability across all data solutions.
- Collaborate closely with business analysts, product owners, BI developers, and data scientists to deliver high-quality, actionable data products.
- Utilize CI/CD practices, test automation, and monitoring frameworks to ensure robust, production-ready delivery.
- Work with data warehousing and orchestration tools such as Airflow/Cloud Composer and Dataflow, and support data ingestion via APIs.
- Apply DevOps practices throughout the data lifecycle and contribute to continuous improvement.
- Familiarity with ML engineering concepts and integration with platforms like Vertex AI is a strong plus.
Required Skills & Experience:
- Strong expertise in Google Cloud Platform: BigQuery, Dataflow, Pub/Sub.
- Advanced proficiency in SQL and Python.
- Experience with modern data modeling (medallion architecture, star schema, dimensional modeling).
- Strong understanding of data pipelines, ETL/ELT, and cloud-native processing.
- Experience with APIs, data ingestion frameworks, and orchestrated workflows.
- Knowledge of CI/CD, test automation, monitoring, and DevOps practices.
Qualifications:
- A degree in Software Engineering or a related technical field.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1577554