Job Description

Key Responsibilities:

- Develop and maintain data ingestion pipelines from PostgreSQL source systems to the Enterprise Data Platform (Snowflake and Redshift).

- Design and implement robust ETL workflows using DBT, ensuring data accuracy and performance.

- Orchestrate and schedule data workflows using Apache Airflow (a minimal DAG sketch follows this list).

- Manage and optimize data storage in AWS S3, including Iceberg tables.

- Work with Parquet data formats for efficient consumption by reporting and analytics workloads.

- Monitor pipeline performance, resolve bottlenecks, and troubleshoot data quality issues.

- Collaborate with QA teams and data scientists to ensure end-to-end data integrity.

- Follow industry-standard coding best practices and actively participate in code reviews.
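
To illustrate the kind of orchestration the role involves, here is a minimal sketch of an Airflow DAG that runs a dbt build after a PostgreSQL extract. The DAG name, file paths, and the extract script are hypothetical placeholders, not part of this posting; a real pipeline would add alerting and data-quality checks.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

# Hypothetical daily pipeline: extract from PostgreSQL, then transform with dbt.
with DAG(
    dag_id="postgres_to_warehouse_daily",  # placeholder name
    schedule="@daily",                     # Airflow 2.4+; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
) as dag:
    # Placeholder extract step; a real pipeline might use a Postgres hook or an ingestion tool.
    extract = BashOperator(
        task_id="extract_postgres",
        bash_command="python /opt/pipelines/extract_postgres.py",
    )

    # dbt performs the SQL transformations; the --project-dir/--profiles-dir paths are assumptions.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    # dbt tests guard data accuracy before downstream consumers read the models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )

    extract >> dbt_run >> dbt_test
```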


Required Skills and Qualifications:


- Strong SQL skills with experience in PostgreSQL.

- Proven experience with Snowflake and AWS S3 for data warehousing and storage.

- Hands-on experience with DBT (Data Build Tool) for data modeling and transformation.

- Proficiency in Apache Airflow for data orchestration and scheduling.

- Familiarity with data lakehouse architecture, Iceberg table formats, and Parquet (a Parquet-to-S3 sketch follows this list).

- Solid Python programming skills and experience with API integrations.

- Experience working with large-scale datasets, ensuring performance and scalability.

- Strong problem-solving, communication, and teamwork abilities.
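
To make the Parquet/S3 expectation concrete, below is a minimal sketch that streams a large PostgreSQL table to S3 as Parquet in bounded-memory chunks. The connection string, bucket, table name, and region are hypothetical; a production pipeline would add explicit type casts, partitioning, and error handling (and an Iceberg catalog writer such as PyIceberg for Iceberg tables).

```python
import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq
from pyarrow import fs
from sqlalchemy import create_engine

# Hypothetical credentials and locations; replace with real values.
engine = create_engine("postgresql+psycopg2://user:password@db-host:5432/analytics")
s3 = fs.S3FileSystem(region="us-east-1")

writer = None
# Stream the table in chunks so memory stays bounded on large datasets.
for chunk in pd.read_sql_query("SELECT * FROM orders", engine, chunksize=50_000):
    table = pa.Table.from_pandas(chunk, preserve_index=False)
    if writer is None:
        # Open the Parquet file on the first chunk, once the schema is known.
        # Chunk schemas are assumed consistent; real pipelines should cast explicitly.
        writer = pq.ParquetWriter(
            "my-bucket/raw/orders/part-00000.parquet", table.schema, filesystem=s3
        )
    writer.write_table(table)

if writer is not None:
    writer.close()
```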


Preferred Qualifications:


- Experience with cloud-native data platforms and modern data stack tooling.

- Background in data governance, lineage, or metadata management.

- Familiarity with CI/CD pipelines and DevOps practices in data engineering.

