
Senior ETL Developer - Big Data Tools

Recruitmenthub365
Gurgaon/Gurugram
3 - 5 Years

Posted on: 24/09/2025

Job Description

Job Summary:

We are looking for a highly skilled ETL Developer with extensive experience in managing complex data flows, designing scalable pipelines, and optimizing performance in large data environments. The ideal candidate should have hands-on experience with ETL tools, strong knowledge of PostgreSQL, and the ability to manage environments with thousands of processors (e.g., 5000+ in a single PostgreSQL setup).

Key Responsibilities:
- Design, build, and maintain robust ETL pipelines for large-scale and complex datasets

- Develop, optimize, and troubleshoot SQL queries in PostgreSQL, including working with high-concurrency environments

- Work with 5000+ processor instances in PostgreSQL or setups of similar scale

- Manage data ingestion from multiple sources, ensuring data integrity, consistency, and availability

- Monitor data workflows, identify bottlenecks, and apply performance tuning

- Collaborate with data architects, analysts, and stakeholders to define and fulfill data requirements

- Ensure data quality, validation, and reconciliation across systems

- Create and maintain documentation for data processes, models, and architecture

- Ensure ETL pipelines meet security, privacy, and compliance standards

Required Skills & Experience :


- 3+ years of experience in ETL development and complex data workflows

- Strong hands-on experience with PostgreSQL, including optimization at scale

- Proven ability to manage and process data across massively parallel systems (e.g., 5000+ processor environments)

- Proficient in SQL, PL/pgSQL, and performance tuning

- Experience with ETL tools such as Talend, Apache NiFi, Informatica, and Airflow

- Familiarity with big data ecosystems (Hadoop, Spark, Kafka) is a plus

- Strong understanding of data modeling, warehousing, and data governance

- Excellent analytical, debugging, and problem-solving skills

Preferred Qualifications:
- Experience in cloud platforms (AWS, GCP, or Azure)

- Familiarity with DevOps and CI/CD practices for data pipelines

- Exposure to real-time stream processing

- Knowledge of scripting languages (Python, Bash, etc.)

Education:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field

