hirist

Job Description

Description :



- We are seeking a highly skilled Azure Data Engineer with expertise in Azure Data Factory (ADF) and Azure Synapse Analytics to join our team.

- This role involves integrating seamlessly with Internal Development Platforms (IDP) and other tools to enhance the developer experience on Azure.

- The ideal candidate will have a strong background in data integration, ETL processes, and data warehousing.

- This role involves designing, developing, and maintaining data solutions that support our business intelligence and analytics initiatives.

Know Your Team :



- At ValueMomentum's Technology Solution Centers, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain.

- We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes.

- Our core expertise lies in six key areas : Platforms, Infra/Cloud, Application, Data, Core, and Quality Assurance.

- Join a team that invests in your growth.

- Our Infinity Program empowers you to build your career with role-specific skill development, leveraging immersive learning platforms.

- You'll have the opportunity to showcase your talents by contributing to impactful projects.

Responsibilities :



- Write SQL, Python, and PySpark code for data processing.

- Create pipelines (simple and complex) using ADF.

- Work with other Azure services such as Azure Data Lake and Azure SQL Data Warehouse (Synapse).

- Handle large volumes of data with confidence.

- Understand the business requirements for data flow processes.

- Understand requirements from functional and technical specification documents.

- Develop source-to-target mapping documents and transformation business rules as per the scope and requirements.

- Communicate project status continuously, both formally and informally.

- Follow the JIRA story process for SQL development activities.
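For illustration, the source-to-target mapping work described above can be sketched in plain Python. All field names and transformation rules here are hypothetical examples, not part of this role's actual specifications:

```python
# Hypothetical source-to-target mapping applied in code. Field names and
# rules are illustrative only; a real mapping document would define them.

MAPPING = {
    # target_field: (source_field, transformation_rule)
    "policy_id":   ("PolicyNumber", str.strip),
    "premium_usd": ("Premium",      lambda v: round(float(v), 2)),
    "state_code":  ("State",        str.upper),
}

def apply_mapping(source_row: dict) -> dict:
    """Apply the source-to-target mapping rules to one source record."""
    return {
        target: rule(source_row[source])
        for target, (source, rule) in MAPPING.items()
    }

row = {"PolicyNumber": " P-1001 ", "Premium": "1234.567", "State": "tx"}
print(apply_mapping(row))
# {'policy_id': 'P-1001', 'premium_usd': 1234.57, 'state_code': 'TX'}
```

In practice the same mapping logic would be expressed as ADF data-flow transformations or PySpark column expressions rather than plain dictionaries.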

Requirements :



- 8 to 13 years of experience in developing data ingestion, data processing, and analytical pipelines for big data, relational databases, and data warehouse solutions.

- Extensive hands-on experience implementing data migration and data processing using Azure services : Azure Data Lake Storage, Azure Data Factory, Azure Functions, Synapse, Azure SQL DB, etc.

- Experience with Datasets, DataFrames, Azure Blob Storage, and Azure Storage Explorer.

- Must have hands-on programming experience in Python/PySpark using DataFrames, SQL, and procedural SQL.

- Well-versed in DevOps and CI/CD deployments.

- Good understanding of Agile/Scrum methodologies.

- Strong attention to detail in high-pressure situations.

- Experience in the insurance (e.g., underwriting, claims, policy issuance) or financial industry preferred.

- Excellent problem-solving skills and the ability to troubleshoot complex issues in distributed systems.

- Effective communication and collaboration skills, with the ability to interact with stakeholders at all levels.
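A core pattern behind the data-ingestion pipelines listed above is watermark-based incremental loading, which ADF delta-copy pipelines commonly implement. The sketch below uses invented in-memory data and field names purely to show the idea:

```python
# Minimal sketch of watermark-based incremental ingestion: load only rows
# modified since the last run, then advance the watermark. All data and
# field names are invented for illustration.
from datetime import datetime

def incremental_load(source_rows, watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 1, 5)},
    {"id": 3, "modified": datetime(2024, 1, 9)},
]
loaded, wm = incremental_load(rows, datetime(2024, 1, 3))
print([r["id"] for r in loaded])  # [2, 3]
```

In an ADF pipeline, the watermark would typically be persisted in a control table and passed to a Copy activity's source query rather than held in memory.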

