Posted on: 17/09/2025
We are looking for a highly skilled Data Engineer with strong expertise in PostgreSQL and hands-on experience in building data warehouses and data lakes.
In this role, you will design and implement a scalable PostgreSQL-based data warehouse and manage a data lake on AWS infrastructure, leveraging primarily open-source technologies.
The ideal candidate has a strong foundation in SQL engineering, cloud data architectures, and modern data pipelines.
Experience with hosted data warehouse platforms such as Snowflake or Databricks, or transformation tooling such as dbt, is a plus.
Key Responsibilities:
Data Engineering & Architecture:
- Design, build, and maintain a PostgreSQL-based data warehouse for scalable analytics and reporting.
- Develop and manage data pipelines (batch and streaming) for ingestion, transformation, and storage.
- Architect and maintain a data lake on AWS infrastructure (e.g., S3, Glue, Athena, Redshift Spectrum).
- Optimize queries, indexing, and schema design for performance and scalability in PostgreSQL.
- Ensure solutions align with modern open-source data engineering best practices.
Collaboration & Delivery:
- Work closely with product, analytics, and engineering teams to deliver high-quality, reliable data solutions.
- Translate business and analytics requirements into scalable data models and pipelines.
- Provide technical expertise to support data-driven decision-making across the organization.
Data Quality & Governance:
- Implement data quality checks, lineage tracking, and metadata management.
Future-Readiness:
- Stay current with modern data engineering tools and frameworks.
Required Qualifications:
- Expertise in PostgreSQL: advanced SQL, query optimization, schema design, and data modeling.
- Strong experience building ETL/ELT pipelines using Python or open-source frameworks (e.g., Airflow, dbt).
- Proficiency with AWS services (S3, Glue, Athena, Redshift Spectrum, Lambda, etc.).
- Experience managing structured and unstructured data at scale.
- Solid programming skills in Python (or similar).
- Familiarity with open-source data frameworks (e.g., Apache Spark, Kafka).
Preferred Qualifications:
- Experience with infrastructure-as-code tools (Terraform, CloudFormation).
- Knowledge of data governance and cataloging tools.
- Cloud certifications (e.g., AWS Data Analytics Specialty).
- Experience in regulated industries (healthcare, finance, life sciences).
What We Offer:
- Opportunity to design and own enterprise-level data platforms with modern tech.
- Flexibility to innovate and work with open-source-first approaches.
- Remote-first culture with growth opportunities in data architecture and leadership.
Posted By: Sucheta S, Talent Acquisition Manager at AQB Solutions Pvt Ltd
Last Active: NA (the recruiter posted this job through a third-party tool).
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1547512