Posted on: 28/04/2026
Company Overview:
Xponentium is a high-growth technology company specializing in AI-powered data engineering, skill assessments, and cloud optimization. We partner with global healthcare and financial institutions to transform raw data into actionable intelligence. As a CL Data Engineer, you will play a critical role in building and scaling our proprietary data framework, ensuring that our clients, ranging from regional health systems to global insurance leaders, benefit from a seamless, unified data experience.
Key Responsibilities & Client Impact:
- Design & Scale : Design, develop, and maintain scalable data solutions using Snowflake and Microsoft Data Fabric to support Xponentium's core data products.
- Client Collaboration : Work directly with stakeholders from key Xponentium clients, including The Cigna Group, MetLife, and specialty healthcare networks, to translate complex business requirements into technical data solutions.
- Build and optimize complex SQL queries, views, and transformations for large-scale healthcare datasets.
- Apply hands-on Python experience to data engineering and automation tasks.
- Develop and manage data pipelines, semantic models, and data products using Microsoft Data Fabric components (Lakehouse, Dataflows, Warehouses, Pipelines).
- Analyze healthcare data (claims, eligibility, provider, clinical, pharmacy) to support reporting, analytics, and business insights.
- Collaborate with business analysts and stakeholders to translate healthcare business requirements into technical data solutions.
- Ensure data quality, accuracy, performance optimization, and adherence to healthcare data standards.
- Perform root cause analysis for data issues and implement robust, scalable fixes.
- Support UAT, data validation, reconciliation, and production support activities.
- Ensure compliance with healthcare regulations and internal data governance standards (HIPAA, PHI/PII handling).
- Contribute to design reviews, code reviews, and best practices for data engineering and analytics solutions.
QUALIFICATION:
- Requires a BA/BS degree in Information Technology, Computer Science, or a related field of study.
EXPERIENCE:
- 5+ years of experience in data engineering / software engineering roles.
- Strong hands-on experience with Snowflake (data modeling, performance tuning, SQL optimization) and Python.
- Strong experience with Microsoft Data Fabric (Lakehouse, Dataflows Gen2, Pipelines, Warehouses).
- Expert-level SQL skills for data analysis, transformation, and validation.
- Strong understanding of data warehousing concepts, dimensional modeling, and ETL/ELT patterns.
- Experience working with large, complex datasets in cloud data platforms.
SKILLS AND COMPETENCIES:
- Microsoft Data Fabric (Lakehouse, Dataflows Gen2, Pipelines, Warehouses).
- Strong hands-on experience with Snowflake (data modeling, performance tuning, SQL optimization).
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1631879