Posted on: 24/09/2025
Job Title: Data Engineer
Work Location: Pune (client location: Kharadi)
Work Mode: Hybrid
Work Schedule:
Onsite: 3 days mandatory (Tuesday, Wednesday, Thursday) | 10:00 am – 2:00 pm IST
WFH: 2 days (Monday & Friday) | 2:30 pm – 10:30 pm IST
Additional WFH hours: 6:30 pm – 10:30 pm IST (on onsite days)
About the Role:
We are seeking an experienced Data Engineer to join our team in Pune.
The ideal candidate will have strong expertise in Core Java, Spring Boot, and Microservices, along with hands-on experience in building scalable data pipelines on Google Cloud (BigQuery, Dataflow, Dataproc, Pub/Sub, etc.).
You will be responsible for designing, developing, and optimizing data solutions that support business intelligence and analytics across the organization.
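For candidates unfamiliar with the stack, the sketch below is purely illustrative (it is not part of the client's codebase): a minimal Apache Beam pipeline in Java, runnable on Dataflow, that streams messages from a Pub/Sub subscription into a BigQuery table. All project, subscription, table, and field names are hypothetical placeholders.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;
import java.time.Instant;

public class PubSubToBigQuery {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(
        PipelineOptionsFactory.fromArgs(args).withValidation().create());

    pipeline
        // Read raw message payloads from a hypothetical Pub/Sub subscription.
        .apply("ReadFromPubSub", PubsubIO.readStrings()
            .fromSubscription("projects/my-project/subscriptions/events-sub"))
        // Wrap each payload in a BigQuery TableRow with an ingestion timestamp.
        .apply("ToTableRow", MapElements
            .into(TypeDescriptor.of(TableRow.class))
            .via((String payload) -> new TableRow()
                .set("payload", payload)
                .set("ingested_at", Instant.now().toString())))
        .setCoder(TableRowJsonCoder.of())
        // Append rows to an existing (hypothetical) BigQuery table.
        .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
            .to("my-project:analytics.events_raw")
            .withCreateDisposition(CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}
```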
Key Responsibilities:
- Design, develop, and operate scalable, high-performance data applications.
- Build and maintain data pipelines for both real-time and batch processing.
- Implement ingestion, transformation, and integration of diverse data sources.
- Develop and maintain data models and schemas to support analytics use cases.
- Collaborate with cross-functional teams including data scientists, analysts, and product teams.
- Ensure data quality, governance, and security compliance.
- Automate processes, improve efficiency, and enable self-service analytics.
- Troubleshoot production issues and provide ongoing operational support.
- Contribute to architecture discussions, technology roadmaps, and sprint activities.
Required Skills & Experience:
- 6+ years of experience in Core Java, Spring Boot, and Microservices development.
- Strong knowledge of Cloud technologies (GCP preferred; AWS/Azure experience a plus).
- Experience with Google Cloud services such as BigQuery, Dataflow, Dataproc, and Pub/Sub, and with data formats such as Avro.
- Expertise in designing and building scalable, reliable data pipelines.
- Hands-on experience with data ingestion, transformation, and modeling.
- Familiarity with data visualization tools (Google Data Studio, Tableau, Power BI, or similar).
- Strong debugging and troubleshooting skills for distributed systems.
- Understanding of data governance, compliance, and security best practices.
- Excellent problem-solving, communication, and collaboration skills.
Nice-to-Have:
- Experience with CI/CD, serverless computing, and infrastructure-as-code.
- Knowledge of API integrations and third-party data source connectivity.
- Exposure to globally distributed engineering teams.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1551618