hirist

Job Description

Description:

We are seeking a skilled Senior Software Engineer (Informatica / Data / ETL) to design, develop, and maintain robust data integration and transformation workflows.

The ideal candidate will bring strong hands-on expertise in Informatica, SQL, and ETL pipeline development, along with experience in data integration across cloud platforms.

This role demands a combination of technical depth and problem-solving skills, enabling the design of efficient, scalable, and secure data pipelines that power analytics, reporting, and enterprise decision systems.

You will collaborate with cross-functional teams to translate business requirements into optimized data workflows that drive accuracy, performance, and reliability.

What You'll Do:

- Design, develop, and manage ETL processes using Informatica PowerCenter / IICS to integrate data from multiple sources.

- Create and optimize data pipelines for structured, semi-structured, and unstructured data across on-premise and cloud environments.

- Write efficient SQL scripts and stored procedures for data transformation, cleansing, and validation.

- Collaborate with data architects and analysts to design and implement data models that support analytics and reporting requirements.

- Troubleshoot and tune ETL jobs for optimal performance, scalability, and reliability.

- Implement data integration strategies across multiple systems and ensure adherence to quality and governance standards.

- Automate recurring data workflows and develop reusable components for streamlined development.

- Participate in code reviews, unit testing, and documentation to ensure high-quality deliverables.

- Work closely with cloud engineering teams to support migrations and deployments on cloud data platforms such as AWS, Azure, or GCP.

- Contribute to continuous improvement initiatives within the data engineering ecosystem.

What You Bring:

- Minimum of 3 years of hands-on experience in ETL and data integration using Informatica (PowerCenter, IICS, or similar).

- Proficiency in SQL with the ability to write complex queries, optimize joins, and manage large datasets.

- Experience designing data pipelines and transformation logic for enterprise data systems.

- Strong understanding of data warehousing concepts, ETL workflows, and data lifecycle management.

- Familiarity with cloud data platforms such as AWS Redshift, Azure Synapse, or Google BigQuery.

- Exposure to data quality frameworks, metadata management, and data lineage tracking.

- Excellent debugging, analytical, and performance-tuning skills.

- Ability to work in Agile / Scrum environments with a collaborative mindset.

- Strong communication and documentation skills for technical discussions and stakeholder interactions.

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Preferred Skills:

- Experience with Python or Shell scripting for ETL automation and orchestration.

- Familiarity with data orchestration tools such as Airflow or Control-M.

- Exposure to data lake architectures and modern data integration frameworks.

- Understanding of CI/CD pipelines for data deployments.

