Posted on: 12/07/2025
Key Responsibilities
- Design, develop, and maintain robust ETL pipelines using Azure Data Factory (ADF) to support complex insurance data workflows.
- Integrate and extract data from various Guidewire modules (PolicyCenter, BillingCenter, ClaimCenter), ensuring data quality, integrity, and consistency.
- Build reusable components for data ingestion, transformation, and orchestration across Guidewire and Azure ecosystems.
- Optimize ADF pipelines for performance, scalability, and cost-efficiency while adhering to industry-standard DevOps and CI/CD practices.
- Collaborate with solution architects, data modelers, and Guidewire functional teams to translate business requirements into scalable ETL solutions.
- Conduct thorough unit testing, data validation, and error handling across all data transformation steps.
- Participate in end-to-end data lifecycle management, from requirement gathering through deployment and post-deployment support.
- Provide technical documentation and pipeline monitoring dashboards, and ensure production readiness.
- Support data migration projects from legacy platforms to Azure cloud environments.
- Follow Agile/Scrum practices, contribute to sprint planning, retrospectives, and stand-ups with strong ownership of deliverables.
Mandatory Skills:
- 6+ years of experience in data engineering with strong command over Azure Data Factory, Azure SQL, and related Azure services.
- Deep hands-on experience building ADF pipelines that integrate with the Guidewire Insurance Suite.
- Proficiency in data transformation using SQL, Stored Procedures, and Data Flows.
- Experience working with Guidewire data models, including an understanding of the PolicyCenter, BillingCenter, and ClaimCenter schemas and business entities.
- Strong understanding of cloud-based data warehousing concepts, data lake patterns, and data governance best practices.
- Proven experience integrating Guidewire systems with downstream reporting and analytics platforms.
- Excellent debugging skills, with the ability to resolve complex data transformation and pipeline performance issues.
Preferred Skills:
- Prior experience in the insurance domain (P&C preferred) or in implementing Guidewire DataHub and/or InfoCenter.
- Familiarity with Power BI, Databricks, or Synapse Analytics.
- Working knowledge of Git-based source control, CI/CD pipelines, and deployment automation.
Additional Requirements:
- Work Mode: 100% onsite at the Hyderabad office (no remote/hybrid flexibility).
- Strong interpersonal and communication skills; must be capable of working with cross-functional teams and client stakeholders.
- Self-starter mindset with a high sense of ownership; must thrive under pressure and tight deadlines.
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1511420