Posted on: 22/10/2025
Job Description :
Key Responsibilities :
- Install, configure, and administer complex Informatica platform environments, specifically covering Informatica PowerCenter, Data Engineering Integration (DEI), and Big Data Management (BDM) components across development, testing, and production landscapes (a routine health-check sketch in this spirit appears after this list).
- Take full ownership of managing platform performance, conducting regular tuning of the Informatica domain, application services (Integration Service, Repository Service), and grid configurations to maximize throughput and minimize latency.
- Implement and manage platform-level security controls, including user and group management, role-based access control (RBAC), and integration with enterprise identity management systems.
- Plan, manage, and execute major version upgrades, patches, and hotfixes for the Informatica ecosystem with meticulous attention to minimizing production downtime.
- Apply strong knowledge of ETL performance tuning techniques within PowerCenter mappings/workflows and data pipeline optimization principles within the DEI/BDM environment.
- Provide expert guidance to development teams on optimization strategies, connection management, and effective resource consumption.
- Collaborate closely with teams utilizing Hadoop, Spark, and other Big Data technologies, ensuring Informatica integration services are optimally configured for distributed processing environments.
- Collaborate effectively with cross-functional development, infrastructure, and operations teams on data projects, ensuring adherence to data governance policies and platform best practices.
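To give a flavour of the routine administration work referenced in the first bullet, here is a minimal Python sketch that pings the PowerCenter Integration Service in each environment via the standard pmcmd client utility. It assumes pmcmd is on the PATH; the environment, domain, and service names are hypothetical placeholders, not values from this posting.

```python
"""Routine availability check for PowerCenter Integration Services.

A minimal sketch: it shells out to the pmcmd client (`pmcmd pingservice`)
for each environment and reports which services answer. The domain and
service names below are hypothetical placeholders.
"""
import subprocess

# Hypothetical environment -> (domain, Integration Service) mapping.
ENVIRONMENTS = {
    "dev": ("Dom_Dev", "IS_Dev"),
    "test": ("Dom_Test", "IS_Test"),
    "prod": ("Dom_Prod", "IS_Prod"),
}

def ping_integration_service(domain: str, service: str) -> bool:
    """Return True if pmcmd reports the Integration Service as reachable."""
    result = subprocess.run(
        ["pmcmd", "pingservice", "-sv", service, "-d", domain],
        capture_output=True, text=True,
    )
    # pmcmd exits with 0 when the service responds to the ping.
    return result.returncode == 0

if __name__ == "__main__":
    for env, (domain, service) in ENVIRONMENTS.items():
        status = "UP" if ping_integration_service(domain, service) else "DOWN"
        print(f"{env:<5} {service:<10} {status}")
```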
Required Technical Skills & Experience :
- 4+ years of dedicated experience in the engineering, administration, and performance tuning of Informatica platforms (PowerCenter, DEI/BDM).
- Deep technical expertise in ETL performance tuning, data partitioning, load balancing, and workflow optimization within the Informatica ecosystem.
- Proficiency in Python and Shell Scripting for systems administration, task automation, and managing application services across Linux/Unix environments (see the housekeeping sketch after this list).
- Working experience with Hadoop distribution platforms and Spark integration, specifically managing Informatica services that leverage these technologies for processing.
- Strong command of underlying RDBMS concepts for repository management and connectivity (e.g., Oracle, SQL Server).
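As a concrete illustration of the scripting requirement above, the following is a minimal task-automation sketch: a housekeeping job that purges aged PowerCenter session log files on a Linux host. The log directory path and retention period are assumptions; adapt them to the installation and schedule the script via cron or an enterprise scheduler.

```python
"""Purge aged PowerCenter session log files.

A minimal task-automation sketch: it scans a (hypothetical) Integration
Service session-log directory and removes *.log files older than a
retention threshold. Path and retention period are assumptions.
"""
import time
from pathlib import Path

# Hypothetical log location and retention policy; adjust to your installation.
LOG_DIR = Path("/opt/informatica/server/infa_shared/SessLogs")
RETENTION_DAYS = 14

def purge_old_logs(log_dir: Path, retention_days: int) -> int:
    """Delete *.log files older than retention_days; return the count removed."""
    cutoff = time.time() - retention_days * 86400
    removed = 0
    for log_file in log_dir.glob("*.log"):
        if log_file.stat().st_mtime < cutoff:
            log_file.unlink()
            removed += 1
    return removed

if __name__ == "__main__":
    count = purge_old_logs(LOG_DIR, RETENTION_DAYS)
    print(f"Removed {count} log file(s) older than {RETENTION_DAYS} days")
```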
Preferred Skills :
- Informatica Certification (e.g., Informatica Certified Professional - Administrator).
- Experience with cloud platforms (AWS, Azure, or GCP) and administering Informatica products deployed on cloud infrastructure (IaaS/PaaS).
- Familiarity with integrating enterprise scheduling tools (e.g., Control-M, Autosys) with Informatica workflows (see the wrapper sketch after this list).
- Knowledge of data governance tools and security hardening best practices for data platforms.
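For the scheduling-tool integration mentioned above, a common pattern is a thin wrapper that the Control-M or Autosys job invokes and judges by its exit code. The sketch below is one possible shape, assuming pmcmd is available on the agent host; the domain, service, folder, workflow, user, and environment-variable names are hypothetical, and the password is supplied through pmcmd's -pv (password environment variable) option.

```python
"""Wrapper a Control-M/Autosys job can invoke to run a PowerCenter workflow.

A minimal sketch: it launches the workflow with `pmcmd startworkflow -wait`,
echoes pmcmd's output into the scheduler's job log, and exits with pmcmd's
own return code (0 = the workflow completed successfully), which is what
the scheduler uses to mark the job OK or failed. All names are hypothetical.
"""
import os
import subprocess
import sys

DOMAIN = "Dom_Prod"           # hypothetical domain name
SERVICE = "IS_Prod"           # hypothetical Integration Service name
FOLDER = "FIN_DWH"            # hypothetical repository folder
WORKFLOW = "wf_daily_load"    # hypothetical workflow name
USER = "etl_ops"              # hypothetical repository user
PASSWORD_ENV = "INFA_PASSWD"  # env var holding the password (read by pmcmd -pv)

def main() -> int:
    if PASSWORD_ENV not in os.environ:
        print(f"ERROR: environment variable {PASSWORD_ENV} is not set", file=sys.stderr)
        return 1
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", SERVICE, "-d", DOMAIN,
        "-u", USER, "-pv", PASSWORD_ENV,
        "-f", FOLDER, "-wait", WORKFLOW,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    # Echo pmcmd output so it lands in the scheduler's job log.
    sys.stdout.write(result.stdout)
    sys.stderr.write(result.stderr)
    return result.returncode

if __name__ == "__main__":
    sys.exit(main())
```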
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1563089