Posted on: 18/03/2026
Job Description:
- 3-7 years of hands-on experience in ETL development, with a strong focus on designing, building, and maintaining scalable and efficient data integration solutions.
- Proven expertise in the MSBI stack, including SSIS (SQL Server Integration Services), SQL Server, and advanced T-SQL programming for data extraction, transformation, and loading processes.
- Strong understanding of data warehousing concepts, including dimensional modeling (star/snowflake schemas), data marts, and data lifecycle management (see the star-schema sketch after this list).
- Demonstrated experience in the insurance domain, with deep functional knowledge of claims processing, the claims lifecycle, and related data structures.
- Experience working on data migration projects, including legacy system analysis, data mapping, transformation logic, validation, and reconciliation. (Preferred)
- Hands-on or working knowledge of Duck Creek Claims, including its data model, architecture, and integration patterns. (Preferred)
- Exposure to modern data platforms and cloud-based ETL tools such as Azure Data Factory (ADF), Snowflake, and/or Databricks, particularly in data integration, transformation, and migration scenarios. (Preferred)
- Experience working in Agile/Scrum environments, actively participating in sprint planning, daily stand-ups, retrospectives, and backlog grooming using tools like JIRA and Confluence.
- Strong analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize performance.
- Excellent communication and stakeholder management skills, with the ability to translate business requirements into technical solutions.
- Self-driven and proactive, with the ability to quickly ramp up on new technologies and deliver high-quality results with minimal supervision.
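For illustration only, here is a minimal T-SQL sketch of the kind of star-schema and advanced T-SQL work referenced above: a dimension table plus a Type-1 upsert via MERGE from an SSIS-populated staging table. All object names (DimPolicy, StgPolicy, and their columns) are hypothetical examples, not part of any actual project schema.

```sql
-- Hypothetical star-schema dimension and a Type-1 SCD load via MERGE.
-- All object and column names are illustrative only.
CREATE TABLE dbo.DimPolicy (
    PolicyKey    INT IDENTITY(1,1) PRIMARY KEY,   -- surrogate key
    PolicyNumber VARCHAR(20) NOT NULL,            -- business key
    ProductLine  VARCHAR(50) NOT NULL,
    PolicyStatus VARCHAR(20) NOT NULL,
    UpdatedAt    DATETIME2   NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Type-1 upsert from a staging table populated by an SSIS data flow.
MERGE dbo.DimPolicy AS tgt
USING dbo.StgPolicy AS src
    ON tgt.PolicyNumber = src.PolicyNumber
WHEN MATCHED AND (tgt.ProductLine <> src.ProductLine
               OR tgt.PolicyStatus <> src.PolicyStatus) THEN
    UPDATE SET tgt.ProductLine  = src.ProductLine,
               tgt.PolicyStatus = src.PolicyStatus,
               tgt.UpdatedAt    = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (PolicyNumber, ProductLine, PolicyStatus)
    VALUES (src.PolicyNumber, src.ProductLine, src.PolicyStatus);
```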
Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines using SSIS, including implementation of complex data transformations, reusable components, logging frameworks, and exception handling mechanisms (a minimal logging sketch follows this list).
- Perform performance tuning and optimization of ETL jobs and database queries to ensure high efficiency, scalability, and reliability.
- Develop and maintain SQL Server database objects, including complex T-SQL queries, stored procedures, views, and functions, ensuring adherence to coding standards and best practices.
- Support and contribute to data modeling initiatives, including the design and implementation of data warehouses and data marts to support reporting, analytics, and business intelligence needs.
- Lead or assist in data migration initiatives, including data extraction from legacy systems, transformation, validation, and loading into target platforms, ensuring data integrity and consistency (see the reconciliation sketch after this list).
- Work closely with the Duck Creek Claims system, understanding its data flows and integrations, and supporting enhancements or issue resolution related to claims data processing.
- Collaborate with teams working on cloud-based data pipelines using Azure Data Factory (ADF), Snowflake, or Databricks, contributing to modern data architecture and migration strategies.
- Partner with cross-functional teams including business analysts, QA teams, and product stakeholders to gather requirements, design solutions, and ensure successful delivery.
- Actively participate in Agile/Scrum ceremonies, contributing to sprint deliverables, progress tracking, and continuous improvement initiatives.
- Ensure data quality, governance, and compliance standards are maintained across all ETL processes and data solutions.
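For illustration only, a rough T-SQL sketch of the logging and exception-handling pattern referenced in the first responsibility: a wrapper procedure that records start/end status and captures failures via TRY/CATCH. The EtlRunLog table, RunLoadClaimFact procedure, and inner LoadClaimFact step are assumed names for the example, not a prescribed framework.

```sql
-- Hypothetical ETL run log and step wrapper; all names are illustrative.
CREATE TABLE dbo.EtlRunLog (
    RunId        INT IDENTITY(1,1) PRIMARY KEY,
    StepName     VARCHAR(100)   NOT NULL,
    StartedAt    DATETIME2      NOT NULL,
    EndedAt      DATETIME2      NULL,
    Status       VARCHAR(20)    NOT NULL,
    ErrorMessage NVARCHAR(4000) NULL
);
GO

CREATE OR ALTER PROCEDURE dbo.RunLoadClaimFact
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @RunId INT;

    -- Log the start of the step.
    INSERT INTO dbo.EtlRunLog (StepName, StartedAt, Status)
    VALUES ('LoadClaimFact', SYSUTCDATETIME(), 'Running');
    SET @RunId = SCOPE_IDENTITY();

    BEGIN TRY
        EXEC dbo.LoadClaimFact;  -- hypothetical transformation/load step

        UPDATE dbo.EtlRunLog
        SET Status = 'Succeeded', EndedAt = SYSUTCDATETIME()
        WHERE RunId = @RunId;
    END TRY
    BEGIN CATCH
        -- Record the failure, then rethrow so SSIS/the job agent sees it.
        UPDATE dbo.EtlRunLog
        SET Status       = 'Failed',
            EndedAt      = SYSUTCDATETIME(),
            ErrorMessage = ERROR_MESSAGE()
        WHERE RunId = @RunId;
        THROW;
    END CATCH
END;
```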
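Similarly, a simple sketch of the kind of validation and reconciliation check used in migration work, comparing row counts and paid-amount totals between a legacy extract and a migrated fact table; legacy.LegacyClaim, dbo.FactClaim, and PaidAmount are hypothetical names for the example.

```sql
-- Hypothetical migration reconciliation: compare row counts and totals
-- between the legacy extract and the migrated target.
SELECT
    src.RowCnt    AS LegacyRows,
    tgt.RowCnt    AS MigratedRows,
    src.TotalPaid AS LegacyPaid,
    tgt.TotalPaid AS MigratedPaid,
    CASE WHEN src.RowCnt = tgt.RowCnt
          AND src.TotalPaid = tgt.TotalPaid
         THEN 'MATCH' ELSE 'MISMATCH' END AS Result
FROM (SELECT COUNT(*) AS RowCnt, SUM(PaidAmount) AS TotalPaid
      FROM legacy.LegacyClaim) AS src
CROSS JOIN
     (SELECT COUNT(*) AS RowCnt, SUM(PaidAmount) AS TotalPaid
      FROM dbo.FactClaim) AS tgt;
```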
Posted in: Data Analytics & BI
Functional Area: Data Engineering
Job Code: 1621699