Posted on: 03/10/2025
Location:
Key Responsibilities:
- Design, develop, and maintain ETL processes using Informatica PowerCenter or similar tools.
- Work with Teradata for efficient data extraction, transformation, and loading.
- Implement scalable data processing solutions using Hadoop ecosystem components (e.g., Hive, Pig, HDFS, Spark).
- Collaborate with data architects, analysts, and stakeholders to understand business requirements and translate them into technical specifications.
- Optimize performance of ETL workflows and Teradata queries for large datasets.
- Perform data quality checks and ensure data integrity across platforms.
- Participate in code reviews, testing, deployment, and documentation processes.
- Troubleshoot and resolve data-related issues in a timely manner.
- Ensure compliance with data governance and security policies.
Required Qualifications:
- 4+ years of hands-on experience in ETL development.
- Proven experience with Informatica PowerCenter.
- Experience with Teradata, including SQL, BTEQ scripting, and performance tuning.
- Working knowledge of the Hadoop ecosystem (Hive, HDFS, Spark, etc.).
- Solid understanding of data warehouse concepts and best practices.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Knowledge of FSL-DM is a plus.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1554645