Posted on: 22/04/2026
Role Overview:
As a Data Engineer at Generali Central Insurance, you will play a crucial role in building and maintaining our data infrastructure, ensuring the availability, reliability, and quality of data for critical business decisions. You will collaborate closely with data scientists, business analysts, and other stakeholders to understand their data needs and develop solutions that enable data-driven insights. Your work will directly impact the efficiency of our operations, the effectiveness of our risk management, and the overall success of our business.
Key Responsibilities:
- Data Pipeline Development: Build and maintain ETL/ELT pipelines using tools like MSSQL, Python, Apache Airflow, and Talend.
- Build, optimize and maintain data models & data-marts to support analytical and reporting requirements for business users.
- Help monitor and enforce data quality standards and procedures to ensure data accuracy and consistency.
- Help maintain data governance policies and procedures to ensure data security and compliance.
- Monitor and troubleshoot data pipeline performance to ensure optimal data availability and reliability.
- Collaborate with data scientists to build and deploy machine learning models.
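To make the pipeline responsibilities above concrete, here is a minimal ETL sketch using only Python's standard library. The table name, CSV layout, and data-quality rule are illustrative assumptions, not Generali's actual data model.

```python
import csv
import io
import sqlite3

# Illustrative raw feed: the schema and field names are made up for this sketch.
RAW_CSV = """policy_id,premium,region
P001,1200.50,North
P002,,South
P003,980.00,North
"""

def extract(text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with a missing premium and cast types."""
    clean = []
    for row in rows:
        if not row["premium"]:
            continue  # basic data-quality rule: premium is mandatory
        clean.append((row["policy_id"], float(row["premium"]), row["region"]))
    return clean

def load(rows, conn):
    """Load: write cleaned rows into a reporting (data-mart) table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS policy_mart "
        "(policy_id TEXT, premium REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO policy_mart VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM policy_mart").fetchone()[0]
print(count)  # → 2 (the row with a missing premium is filtered out)
```

In a production setting the same extract/transform/load split would typically be scheduled and monitored by an orchestrator such as Apache Airflow rather than run as a single script.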
Required Skillset:
- Demonstrated ability to design and implement data warehousing solutions using SQL and other database technologies.
- Proven experience developing and maintaining ETL pipelines using languages and tools such as Python.
- Strong understanding of data modeling principles and techniques.
- Experience with data quality and data governance practices.
- Familiarity with BI/DW concepts and tools.
- Excellent communication and collaboration skills.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Ability to work effectively in a fast-paced, dynamic environment.
- Expected to work independently and grow into a subject-matter expert (SME) within the next 2-3 years.
Technical Skillset:
- Hands-on, day-to-day experience maintaining and supporting data pipelines, data marts, and data orchestration processes in a mid-to-large-scale organization
- Good knowledge of SQL [MSSQL preferred] and Apache Airflow, including DAG design and optimization techniques
- Intermediate knowledge of Python, with experience in the libraries relevant to data engineering
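An orchestrator like Apache Airflow models a pipeline as a DAG of tasks and runs each task only after its upstream tasks finish. As a rough illustration of that dependency resolution (plain standard-library Python, not the Airflow API, and with made-up task names), the ordering amounts to a topological sort:

```python
from graphlib import TopologicalSorter  # standard library since Python 3.9

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
# This mirrors how a DAG is resolved into an execution order, without Airflow itself.
deps = {
    "extract_policies": set(),
    "extract_claims": set(),
    "build_mart": {"extract_policies", "extract_claims"},
    "quality_checks": {"build_mart"},
    "refresh_reports": {"quality_checks"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # both extracts run before build_mart; refresh_reports runs last
```

In Airflow the same structure would be declared with operators and `>>` dependencies, and the scheduler, rather than a single sort, decides what can run in parallel and retries failed tasks.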
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1630239