
Senior ETL Developer - SQL/Informatica

Recruitment Hub 365
Multiple Locations
3 - 7 Years

Posted on: 24/09/2025

Job Description

Job Title : Senior ETL Developer - Scalable Data Pipelines & High-Performance Systems

Location : Gurugram

Experience : 3+ years

Department : Data Engineering / Analytics

Role Overview :

We are seeking an experienced Senior ETL Developer to design, build, and optimize large-scale data pipelines in high-performance environments. The role requires deep expertise in PostgreSQL, hands-on experience with ETL tools, and the ability to manage highly complex, parallel data workflows (e.g., environments with 5000+ processors).

Key Responsibilities :

- Design, develop, and maintain scalable ETL pipelines for complex and large datasets.

- Optimize and troubleshoot SQL and PL/pgSQL queries in PostgreSQL, ensuring stability in high-concurrency systems.

- Manage and scale data ingestion across multiple sources, ensuring accuracy, integrity, and availability.

- Monitor workflows, resolve bottlenecks, and implement performance tuning strategies.

- Collaborate with architects, analysts, and business stakeholders to deliver data-driven solutions.

- Ensure compliance with data governance, quality standards, and security/privacy requirements.

- Maintain documentation for data models, processes, and system architecture.

Required Skills & Experience :

- 3+ years of ETL development experience with complex, large-scale data workflows.

- Strong expertise in PostgreSQL (including performance optimization at scale).

- Proven ability to manage data processing in massively parallel environments (e.g., 5000+ processor setups).

- Proficiency in SQL, PL/pgSQL, and database performance tuning.

- Hands-on experience with ETL tools (e.g., Talend, Apache NiFi, Informatica, Airflow).

- Good understanding of data modeling, warehousing, and data governance.

- Strong analytical and problem-solving skills with attention to detail.

- Familiarity with big data technologies (Hadoop, Spark, Kafka) is a plus.

Preferred Qualifications :

- Experience working on cloud platforms (AWS, Azure, or GCP).

- Knowledge of DevOps/CI-CD practices for data engineering.

- Exposure to real-time/streaming data systems.

- Proficiency in scripting languages (Python, Bash, etc.).

Education : Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.


May work from home