Posted on: 08/01/2026
Role Overview:
We are looking for an experienced Data Engineer with strong expertise in Google Cloud Platform (GCP) to join our healthcare technology team. The role focuses on building scalable data pipelines and orchestrating workflows using GCP Dataflow, Cloud Composer, and related services. Hands-on SQL experience is required. The ideal candidate brings a mix of technical depth in cloud data engineering, an understanding of healthcare data standards, and strong experience with migration projects.
Key Responsibilities:
- Design, build, and maintain data pipelines on Google Cloud Platform using Dataflow (Apache Beam).
- Develop and manage orchestration workflows using Cloud Composer (Airflow).
- Ingest, transform, and process large-scale data with a focus on performance, scalability, and compliance.
- Collaborate with business analysts and healthcare SMEs to understand workflows and translate them into data solutions.
- Optimize data pipelines for cost efficiency, performance, and scalability.
- Ensure data quality, lineage, and governance across claims datasets.
- Integrate structured and unstructured data sources into data lakes/warehouses.
- Implement data security and HIPAA compliance standards in all processes.
- Build reusable frameworks for ETL/ELT processes in a regulated healthcare environment.
- Support Agile delivery model by participating in sprint planning, reviews, and retrospectives.
- Lead the design and development of data pipelines from GCP to Dataverse.
- Implement data ingestion, transformation, and load processes as per architecture and data mapping specifications.
- Work closely with the Data Architect to translate data models and mapping documents into executable pipelines.
- Ensure data quality, validation, reconciliation, and error handling during data movement.
- Perform performance tuning of:
1. GCP queries and transformations
2. Data transfer volumes and batching strategies
3. Dataverse ingestion methods (bulk APIs, dataflows, throttling limits)
- Guide and mentor data engineers; perform code reviews and enforce best practices.
- Support testing, UAT, cutover, and post-go-live stabilization activities.
- Coordinate with cloud, Power Platform, and integration teams to ensure end-to-end data flow.
Required Skills & Qualifications:
- Strong hands-on experience with GCP data services (BigQuery, Dataflow, Cloud Storage, Cloud Composer, Pub/Sub).
- Strong SQL skills and experience optimizing large-scale data workloads.
- Deep understanding of ETL/ELT performance tuning, batching, parallelism, and retries.
- Experience with data quality frameworks, operational monitoring, and production support.
- Strong experience working in Agile delivery models.
- Good to have: familiarity with FHIR, HL7, EDI 837/835, or other healthcare data standards.
- Hands-on experience with CI/CD pipelines for data solutions.
- Familiarity with Power Platform, Azure integration services, and REST APIs.
- Good to have: experience integrating data into Microsoft Dataverse, including bulk and incremental ingestion patterns.
Posted by: Khushboo Jaiswal, Talent Acquisition Specialist at INNOVA SOLUTIONS PRIVATE LIMITED
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1598299