Posted on: 24/02/2026
Job Summary:
We are looking for an experienced PostgreSQL Data Engineer to design and manage high-volume data pipelines and optimize large-scale datasets. The ideal candidate should have strong expertise in performance tuning, ETL processes, and handling high-throughput systems.
Key Responsibilities:
- Design and manage high-volume data ingestion pipelines using PostgreSQL
- Handle large datasets (millions to billions of records) with high performance
- Develop and maintain optimized stored procedures (PL/pgSQL), functions, and triggers
- Perform query tuning, indexing, partitioning, and vacuum strategies
- Build ETL/ELT pipelines from files (CSV/XLSX), APIs, and message queues
- Ensure data integrity, consistency, and fault-tolerant processing
- Collaborate with backend and analytics teams for reporting and real-time use cases
- Monitor database performance and proactively resolve bottlenecks
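As a rough illustration of the batch-ingestion work described above, here is a minimal Python sketch that parses CSV input and groups rows into fixed-size batches suitable for bulk INSERT/COPY into PostgreSQL. The function names and the batch size are hypothetical, not part of this posting:

```python
import csv
import io
from typing import Iterable, Iterator, List

def batch_rows(rows: Iterable[List[str]],
               batch_size: int = 1000) -> Iterator[List[List[str]]]:
    """Group parsed CSV rows into fixed-size batches for bulk loading."""
    batch: List[List[str]] = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

def read_csv_batches(text: str,
                     batch_size: int = 1000) -> Iterator[List[List[str]]]:
    """Parse CSV text (skipping the header row) and yield row batches."""
    reader = csv.reader(io.StringIO(text))
    next(reader, None)  # skip header row
    return batch_rows(reader, batch_size)

# Example: 5 data rows with batch_size=2 -> batches of sizes 2, 2, 1
sample = "id,name\n1,a\n2,b\n3,c\n4,d\n5,e\n"
sizes = [len(b) for b in read_csv_batches(sample, batch_size=2)]
```

In a real pipeline each batch would typically be handed to a single multi-row INSERT or a COPY statement, which keeps round trips and transaction overhead low at high volume.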
Required Skills & Experience:
- 3-7 years of strong hands-on PostgreSQL experience
- Proven experience with large datasets and high-throughput systems
- Expertise in query optimization, indexes, partitions, and execution plans
- Strong knowledge of PL/pgSQL, stored procedures, and triggers
- Experience in batch and streaming data pipelines
- Familiarity with cloud environments (AWS / GCP), backups, and replication
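To give a flavor of the partitioning expertise listed above, here is a small Python sketch that generates the DDL for one monthly range partition in PostgreSQL's declarative-partitioning syntax. The table name and partitioning scheme are hypothetical examples, not from this posting:

```python
from datetime import date

def monthly_partition_ddl(table: str, month_start: date) -> str:
    """Build DDL for one monthly range partition of `table`."""
    # Upper bound is the first day of the following month
    # (range partition bounds are inclusive FROM, exclusive TO).
    if month_start.month == 12:
        upper = date(month_start.year + 1, 1, 1)
    else:
        upper = date(month_start.year, month_start.month + 1, 1)
    name = f"{table}_{month_start:%Y_%m}"
    return (
        f"CREATE TABLE {name} PARTITION OF {table} "
        f"FOR VALUES FROM ('{month_start}') TO ('{upper}');"
    )

ddl = monthly_partition_ddl("events", date(2026, 2, 1))
```

Pre-creating partitions like this (often from a scheduled job) keeps inserts fast and lets old data be detached or dropped as a whole partition instead of row-by-row deletes.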
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1615525