Posted on: 30/10/2025
Description :
- Integrating backend APIs with analytical front-ends.
- 3+ years of IT experience.
- Good understanding of analytics tools for effective data analysis.
- Ability to learn new tools and technologies.
- Should have worked on at least one relational (SQL/Oracle/Postgres) and one NoSQL database.
- Should have a very good understanding of data warehouse (DW), data mart, and data modelling concepts.
- Should have been part of a data warehouse design team on at least one project.
Roles & Responsibilities :
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data (a minimal pipeline sketch follows this list).
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost in large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data processing layers.
- Interact closely with data engineers to identify the right tools for delivering product features by performing POCs.
- Collaborate as a team player with business stakeholders, BAs, and other Data/ML engineers.
- Research new use cases for existing data.
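To illustrate the kind of ETL work described above, here is a minimal sketch of an Apache Beam batch pipeline on GCP that reads JSON log lines from Cloud Storage, transforms them, and loads them into BigQuery. The bucket, project, dataset, table, and field names are hypothetical placeholders, not part of this posting.

```python
# Minimal Apache Beam batch ETL sketch: Cloud Storage -> transform -> BigQuery.
# All resource names (bucket, project, dataset, table, fields) are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line):
    # Flatten one JSON log line into a dict matching the BigQuery schema below.
    record = json.loads(line)
    return {
        "user_id": record["user_id"],
        "event": record["event"],
        "ts": record["ts"],
    }


def run():
    # Pass --runner=DataflowRunner (plus project/region/temp_location) on the
    # command line to execute on Dataflow instead of the local runner.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadLogs" >> beam.io.ReadFromText("gs://example-bucket/logs/*.json")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```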
Preferred Skills :
- Awareness of design best practices for OLTP and OLAP systems.
- Exposure to load-testing methodologies, pipeline debugging, and delta-load handling.
- Creation of DAG files using Python and SQL for ETL (see the DAG sketch after this list).
- Experience in exploratory analysis of log data.
- Apache Beam development experience with Google Cloud Bigtable and Google BigQuery is desirable.
- Experience in Google Cloud Platform (GCP).
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow).
- Experience with Spring Boot.
- Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions.
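For the DAG-creation and delta-load items above, here is a minimal sketch of a daily delta load into BigQuery written as an Airflow DAG. Airflow (e.g. on Cloud Composer) is an assumption here, since the posting only says "DAG file"; the apache-airflow-providers-google package is assumed installed, and all project, dataset, table, and column names are hypothetical.

```python
# Minimal Airflow DAG sketch: daily delta load into BigQuery via MERGE.
# Airflow/Composer is assumed; all resource names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Delta-load query: only rows updated since the start of the schedule
# interval are merged; {{ data_interval_start }} is an Airflow template value.
DELTA_LOAD_SQL = """
MERGE `example-project.analytics.orders` AS target
USING (
  SELECT * FROM `example-project.staging.orders`
  WHERE updated_at >= TIMESTAMP('{{ data_interval_start }}')
) AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET target.status = source.status, target.updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT ROW
"""

with DAG(
    dag_id="orders_delta_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    merge_delta = BigQueryInsertJobOperator(
        task_id="merge_delta",
        configuration={
            "query": {
                "query": DELTA_LOAD_SQL,
                "useLegacySql": False,
            }
        },
    )
```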
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1567888