We are seeking a highly skilled Data Engineer with experience in Azure Data Factory (ADF) and API-based data integration to serve as a subject matter expert on our Business Intelligence and Data Engineering team.
This individual will play a key role in modernizing the integration between the Applied EPIC agency management system and Salesforce CRM, replacing existing Informatica workflows with Microsoft-native services across Azure.
The ideal candidate has hands-on experience designing and implementing cloud-based data pipelines, working with REST/SOAP APIs and SDKs, and optimizing end-to-end data flow orchestration within Azure.
This position requires strong technical depth in Azure integration services; strong SQL skills, including XML and JSON manipulation in SQL; working knowledge of SOAP XML and SharePoint APIs; a collaborative mindset; the ability to lead a team through delivery; and a disciplined approach to data quality, governance, and operational monitoring.
Key Responsibilities:
- Design and build modern data integration pipelines in Azure Data Factory to replace legacy Informatica workflows between EPIC AMS and Salesforce.
- Develop and maintain Azure Functions (.NET or Python) to interface with EPIC SDK APIs for reading, writing, and bulk updating policy, client, and transaction data.
- Implement Logic Apps workflows to orchestrate near real-time integrations, data refreshes, and error-handling processes.
- Configure and manage Azure API Management (APIM) for secure and scalable API calls to EPIC and Salesforce endpoints.
- Design robust data flow orchestration patterns using ADF pipelines, triggers, and linked services across OneLake/ADLS Gen2.
- Implement monitoring and logging using Application Insights, Log Analytics, and Azure Monitor to track pipeline health and performance.
- Work closely with Salesforce, BI, and Data Architecture teams to align data models and ensure consistent schema mappings across systems.
- Support data transformation and validation using Mapping Data Flows, Synapse Notebooks, or Fabric Dataflows where needed.
- Build reusable frameworks for error handling, retries, and dead-letter queues using Service Bus and Event Grid.
- Enforce data governance and compliance (PII, audit trails, IPE) through secure credential management in Azure Key Vault and structured logging.
- Contribute to CI/CD pipelines via Azure DevOps or GitHub Actions for version control, testing, and deployment automation.
Required Skills and Experience:
- 5+ years of experience as a Data Engineer or Integration Developer, with at least 3 years in Azure Data Factory.
- Proven experience integrating on-premises or SaaS systems via APIs/SDKs, preferably with Applied EPIC, Salesforce, or similar CRM/AMS systems.
- Proficiency in ADF pipeline orchestration, including Copy Activity, REST connectors, Web Activities, and Custom Activities.
- Strong programming skills in C# (.NET) or Python for API development and automation.
- Hands-on experience with Logic Apps, Service Bus, Event Grid, and Azure Functions.
- Working knowledge of Azure API Management, Key Vault, Application Insights, and Azure Monitor.
- Solid understanding of data modeling, ETL/ELT, and data quality frameworks within the Azure ecosystem.
- Familiarity with Power BI, Fabric Lakehouse, or Synapse Analytics is a plus.
- Excellent problem-solving, documentation, and communication skills with a collaborative, delivery-focused approach.
Preferred Qualifications:
- Experience with Applied EPIC SDKs or agency management system APIs.
- Exposure to Salesforce Bulk API, SOQL, or Salesforce data model.
- Understanding of insurance brokerage, financial, or CRM data domains.
- Azure certifications such as DP-203, DP-500, or PL-300.
- Experience with CI/CD using Azure DevOps Pipelines or GitHub Actions.
Key Outcomes:
- Replacement of Informatica pipelines with Azure-native orchestration in ADF and Logic Apps.
- Reliable, auditable, and automated integration of EPIC and Salesforce data.
- Scalable and maintainable API-driven data architecture aligned with the Microsoft Fabric strategy.
- Improved monitoring, performance, and data quality through centralized observability tools.