Posted on: 04/02/2026
Description :
Job Title : Data Engineer
Location : Mumbai, Pune, Bangalore, Chennai, Hyderabad
Experience : 5 - 10 years
Employment Type : Full-time
Role Overview :
We are seeking a Data Engineer to support our enterprise data modernization initiative. This role involves building scalable Azure Data Factory (ADF) ingestion pipelines, developing Snowflake-based ELT across Bronze/Silver/Gold layers, improving data quality, and powering analytics products.
The engineer will collaborate with cross-functional teams to deliver secure, high-quality, and high-performance data solutions that enable clinical, operational, and financial insights.
Key Responsibilities :
- Design and develop Azure ADF pipelines to ingest data from EHRs, billing systems, HL7 feeds, CSV/flat files, and third-party sources.
- Implement metadata-driven ingestion patterns with ADLS landing/bronze zones.
- Build and optimize Snowflake ELT workflows using tasks, streams, and stored procedures, including warehouse performance tuning.
- Transform data across Bronze, Silver, and Gold layers following the Medallion Architecture and business rules for the RCM, ACO, and patient/provider domains.
- Implement data quality checks, reconciliation logic, schema drift handling, and monitoring controls.
- Ensure HIPAA-compliant handling of PHI/PII using secure views, role-based access, and minimum necessary principles.
- Create datasets optimized for Power BI semantic models and enterprise reporting needs.
- Support DevOps processes: version control, pull requests, CI/CD, and release readiness using Azure DevOps.
- Collaborate with BAs, Data Modelers, BI developers, QA engineers, DevOps, and Architects in Agile/Scrum ceremonies.
- Maintain documentation, lineage, technical specs, and metadata using Purview or equivalent tools.
Required Skills & Experience :
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field (cloud/data certifications preferred).
- Snowflake (4+ yrs): ELT development, SQL tuning, warehouse optimization, micro-partitioning, clustering.
- Azure Data Factory (2+ yrs): Pipelines, Data Flows, triggers, parameterization, ingestion frameworks.
- Python (2+ yrs): File ingestion, parsing, API extraction, schema checks, DQ automation, orchestration helpers integrated with ADF.
- Understanding of healthcare data (HL7, claims, provider/member, encounters) is a plus.
- Strong knowledge of data quality controls, logging, monitoring, and error handling.
- Experience working in Agile/Scrum environments using Jira or Azure DevOps.
Preferred Qualifications :
- Experience with Medallion Architecture in data lake/warehouse environments.
- Exposure to data governance tools (Purview, Collibra, etc.).
- Knowledge of Power BI semantic modeling and enterprise reporting.
- Advanced certifications in Azure, Snowflake, or Data Engineering.
Why Join Us :
- Work on cutting-edge healthcare data modernization projects.
- Opportunity to design enterprise-scale data pipelines with Azure and Snowflake.
- Collaborative, Agile environment with strong focus on innovation and compliance.
- Competitive compensation and benefits package.
Posted by
Swati.Pawar
CitiusTech Healthcare Technology Pvt Ltd.
Functional Area : Other
Job Code : 1609516