Posted on: 08/07/2025
Primary Responsibilities:
- Lead the design, development, optimization, and maintenance of scalable ETL workflows and data pipelines.
- Use ETL tools to transform data from external sources into proprietary internal formats.
- Develop and maintain ETL workflows to automate data transformation processes.
- Migrate and modernize existing SSIS packages to Azure Data Factory (ADF).
- Refactor database schemas and ETL logic to support generic, configuration-driven processing (see the sketch after this list).
- Implement reusable ETL components and templates to reduce duplication and improve standardization.
- Ensure data quality by implementing validation, reconciliation, and error handling mechanisms.
- Collaborate with DBAs, data architects, analysts, and business stakeholders to understand data requirements and propose scalable solutions.
- Implement logging, monitoring, and alerting for ETL workflows to support operational reliability.
- Create and maintain documentation for ETL logic, data lineage, and workflow architecture.
- Work with DevOps teams to implement CI/CD practices for ETL deployments (e.g., via Azure DevOps or GitHub Actions).
- Participate in peer code reviews and uphold development best practices.
- Mentor junior ETL developers and contribute to internal knowledge sharing.
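The configuration-driven processing mentioned above generally means one generic loader whose behavior is driven entirely by metadata rather than by per-feed code. Below is a minimal sketch of the idea; every table, column, and query name in it is a hypothetical placeholder, and an ADF implementation would express the same pattern with a Lookup activity feeding a ForEach over parameterized Copy activities.

```python
# Minimal sketch of configuration-driven ETL dispatch (hypothetical names).
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeedConfig:
    source_query: str              # query run against the source system
    target_table: str              # destination table in the warehouse
    load_mode: str                 # "full" or "incremental"
    watermark_column: Optional[str] = None  # used only for incremental loads

# In production this list would be read from a metadata table, not hard-coded.
FEEDS = [
    FeedConfig("SELECT * FROM dbo.Orders", "stg.Orders", "incremental", "ModifiedDate"),
    FeedConfig("SELECT * FROM dbo.Customers", "stg.Customers", "full"),
]

def run_feed(cfg: FeedConfig) -> None:
    """One generic loader handles every feed; all variation lives in config."""
    query = cfg.source_query
    if cfg.load_mode == "incremental" and cfg.watermark_column:
        # Only pull rows changed since the last recorded watermark.
        query += f" WHERE {cfg.watermark_column} > @last_watermark"
    print(f"Loading {cfg.target_table} ({cfg.load_mode}): {query}")
    # extract, transform, and load steps would run here

for feed in FEEDS:
    run_feed(feed)
```

Adding a new feed then becomes a metadata insert rather than a code change, which is what makes this style of pipeline cheap to standardize and reuse.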
Minimum Skills:
- 6+ years of experience in ETL development, including strong hands-on experience with SSIS.
- 3+ years of experience with Azure Data Factory (ADF) or other cloud-based ETL tools.
- Proficient in SQL and experienced with relational databases such as SQL Server and Azure SQL.
- Solid understanding of data modelling, data warehousing, and ETL design patterns.
- Strong analytical skills, with the ability to weigh the costs, benefits, and trade-offs of multiple architectural options: suitability for purpose, long-term maintainability, future viability, and cost-effectiveness.
- Experience developing parameterized, metadata-driven ETL pipelines.
- Strong understanding of data exchange formats such as EDI, cXML, and other structured formats used in B2B integrations.
- Familiar with source integration methods, including REST APIs, SFTP, flat files, and cloud data lakes (see the sketch after this list).
- Comfortable in Agile/Scrum environments and familiar with version control and deployment workflows.
- Demonstrated experience in refactoring legacy ETL systems for modernization.
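To make the source-integration bullet concrete, here is a minimal sketch of one such method: pulling a JSON payload from a REST endpoint and landing it as a flat staging file. The URL and record layout are hypothetical placeholders; a real pipeline would add authentication, paging, and retry logic.

```python
# Minimal sketch: REST API -> flat-file staging (hypothetical endpoint).
import csv
import json
import urllib.request

API_URL = "https://api.example.com/v1/orders"  # hypothetical placeholder

def extract_to_csv(url: str, out_path: str) -> int:
    """Fetch a JSON array of records and write it to a CSV staging file."""
    with urllib.request.urlopen(url) as resp:
        records = json.load(resp)
    if not records:
        return 0
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
    return len(records)

if __name__ == "__main__":
    print(f"Staged {extract_to_csv(API_URL, 'orders_staging.csv')} records")
```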
Preferred Skills:
- Experience with other Azure data tools like Synapse Analytics, Azure Functions, Logic Apps, or Databricks.
- Familiarity with data security, masking, and anonymization, especially in regulated domains (see the sketch after this list).
- Experience in data governance, lineage tracking, and cataloguing tools (e.g., Purview, Collibra).
- Knowledge of scripting languages such as Python or PowerShell for data processing or automation.
- Experience working in hybrid data environments (on-prem + cloud).
- Familiarity with healthcare or supply chain industries, which often utilize EDI for data exchange.
- Knowledge of API integration principles and RESTful web services.
- Experience with data mapping and schema definition tools (e.g., Altova MapForce, XSD tools, or equivalent).
- Experience with Corepoint Integration Server or similar integration engines.
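Since the posting calls out masking and anonymization in regulated domains, the sketch below shows one common approach: deterministic pseudonymization, which replaces an identifier with a stable, non-reversible token so joins across masked tables still line up. The salt handling and field names are hypothetical; in practice the salt would live in a secrets store and be rotated per environment.

```python
# Minimal sketch of deterministic pseudonymization (hypothetical fields).
import hashlib

SALT = b"rotate-me-per-environment"  # hypothetical; keep in a secrets vault

def pseudonymize(value: str) -> str:
    """Map an identifier to a stable, non-reversible token."""
    digest = hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()
    return digest[:16]  # shortened token for readability

rows = [
    {"patient_id": "A-1001", "total": 250.00},
    {"patient_id": "A-1002", "total": 75.50},
]

# The same input always yields the same token, so masked tables still join.
masked = [{**row, "patient_id": pseudonymize(row["patient_id"])} for row in rows]
print(masked)
```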
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1509226