Informatica Developer - Hive/Spark

Collabera
Multiple Locations
4 - 6 Years

Posted on: 03/10/2025

Job Description

Location:

This position requires working from Kuala Lumpur, Malaysia.


Key Responsibilities:

- Design, develop, and maintain ETL processes using Informatica PowerCenter or similar tools.

- Work with Teradata for efficient data extraction, transformation, and loading.

- Implement scalable data processing solutions using Hadoop ecosystem components (e.g., Hive, Pig, HDFS, Spark).

- Collaborate with data architects, analysts, and stakeholders to understand business requirements and translate them into technical specifications.

- Optimize performance of ETL workflows and Teradata queries for large datasets.

- Perform data quality checks and ensure data integrity across platforms.

- Participate in code reviews, testing, deployment, and documentation processes.

- Troubleshoot and resolve data-related issues in a timely manner.

- Ensure compliance with data governance and security policies.


Required Qualifications:


- Bachelor's degree in Computer Science, Information Technology, or a related field.

- 4+ years of hands-on experience in ETL development.

- Proven experience with Informatica PowerCenter.

- Experience with Teradata, including SQL, BTEQ scripting, and performance tuning.

- Working knowledge of the Hadoop ecosystem (Hive, HDFS, Spark, etc.).

- Solid understanding of data warehouse concepts and best practices.

- Strong analytical and problem-solving skills.

- Excellent communication and teamwork abilities.

- Knowledge of FSL-DM is a plus.

