hirist

Job Description

Description :

We are a leading technology solutions provider specializing in data-driven insights and digital transformation for the financial services and healthcare industries. We empower our clients with cutting-edge data engineering and analytics solutions, enabling them to make informed decisions and optimize their operations. Our commitment to innovation and client success has established us as a trusted partner for organizations seeking to leverage the power of data.

Role Overview :

As a Data Engineer, you will be responsible for designing, building, and maintaining scalable and reliable data pipelines that power our analytics and reporting platforms. You will collaborate closely with data scientists, analysts, and other engineers to understand data requirements, develop ETL processes, and ensure data quality. Your work will directly impact our ability to deliver actionable insights to our clients, driving business growth and improving patient outcomes.

Key Responsibilities :

- Design and implement robust ETL pipelines using Python, Spark, and AWS Glue to ingest, transform, and load data from various sources into our data warehouse.

- Develop and maintain data models and schemas to ensure data consistency and integrity across our data platforms.

- Monitor and troubleshoot data pipeline performance, identifying and resolving bottlenecks to ensure optimal data delivery.

- Collaborate with data scientists and analysts to understand their data needs and provide them with the data they require for their analyses.

- Implement data quality checks and validation processes to ensure the accuracy and reliability of our data.

- Automate data pipeline deployment and monitoring using infrastructure-as-code principles.

- Contribute to the development of data engineering best practices and standards within the organization.
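The responsibilities above revolve around the extract-transform-load pattern with built-in data-quality checks. As a minimal, language-agnostic sketch of that shape (a real pipeline here would run on Spark or AWS Glue reading from S3; the table, columns, and sample rows below are invented for illustration), plain Python with SQLite standing in for the warehouse:

```python
import sqlite3

# Hypothetical source records; a Glue/Spark job would read these from S3 instead.
SOURCE_ROWS = [
    {"patient_id": "P001", "visit_date": "2024-01-15", "charge": "250.00"},
    {"patient_id": "P002", "visit_date": "2024-01-16", "charge": "bad"},  # fails validation
    {"patient_id": "P003", "visit_date": "2024-01-17", "charge": "99.50"},
]

def extract(rows):
    """Extract: yield raw records from the source system."""
    yield from rows

def transform(record):
    """Transform: cast types and normalise fields; return None for invalid rows."""
    try:
        return (record["patient_id"], record["visit_date"], float(record["charge"]))
    except (KeyError, ValueError):
        return None  # data-quality check: reject rows that fail type casting

def load(conn, rows):
    """Load: write validated rows into the warehouse table (SQLite stands in here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS visits (patient_id TEXT, visit_date TEXT, charge REAL)"
    )
    conn.executemany("INSERT INTO visits VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(source_rows):
    """Run extract -> transform -> load and report loaded vs. rejected row counts."""
    conn = sqlite3.connect(":memory:")
    valid, rejected = [], []
    for raw in extract(source_rows):
        row = transform(raw)
        if row is not None:
            valid.append(row)
        else:
            rejected.append(raw)
    load(conn, valid)
    loaded = conn.execute("SELECT COUNT(*) FROM visits").fetchone()[0]
    return loaded, len(rejected)

loaded, rejected = run_pipeline(SOURCE_ROWS)
print(f"loaded={loaded} rejected={rejected}")  # loaded=2 rejected=1
```

Counting rejected rows separately, rather than silently dropping them, is what makes the quality checks in the responsibilities above monitorable.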

Required Skillset :

- Demonstrated ability to design, develop, and maintain ETL pipelines using Python and Spark.

- Proven experience with AWS Glue or similar cloud-based ETL services.

- Strong understanding of data warehousing concepts and data modeling techniques.

- Ability to write complex SQL queries and optimize query performance.

- Experience with data quality monitoring and validation processes.

- Excellent communication and collaboration skills, with the ability to work effectively in a team environment.

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Adaptability to work in a fast-paced, dynamic environment.
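The skillset above calls out writing complex SQL and optimizing query performance. One common pattern of that kind is a window-function query (latest record per key) paired with a supporting index. A small sketch using SQLite, with an invented `visits` table and sample data purely for illustration:

```python
import sqlite3

# Hypothetical warehouse table; names and rows are invented for this example.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE visits (patient_id TEXT, visit_date TEXT, charge REAL);
INSERT INTO visits VALUES
    ('P001', '2024-01-15', 250.0),
    ('P001', '2024-02-10', 120.0),
    ('P002', '2024-01-16', 80.0);
-- A composite index on (partition key, ordering key) is a typical
-- optimization for the window scan in the query below.
CREATE INDEX idx_visits_patient_date ON visits (patient_id, visit_date);
""")

# Window function: most recent visit per patient.
latest = conn.execute("""
    SELECT patient_id, visit_date, charge
    FROM (
        SELECT patient_id, visit_date, charge,
               ROW_NUMBER() OVER (
                   PARTITION BY patient_id ORDER BY visit_date DESC
               ) AS rn
        FROM visits
    )
    WHERE rn = 1
    ORDER BY patient_id
""").fetchall()
print(latest)  # [('P001', '2024-02-10', 120.0), ('P002', '2024-01-16', 80.0)]
```

The same `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` shape carries over to Spark SQL and most warehouse engines.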
