hirist

Data Architect - PostgreSQL/MongoDB

Neotas Development Private Limited
Anywhere in India/Multiple Locations
8 - 10 Years

Posted on: 20/08/2025

Job Description

We're Hiring : Data Architect at Neotas

Experience : 8+ Years (with a minimum of 5 years in an architectural role)

Type : Full-time


About the Role :


We are seeking an experienced Data Architect to design, scale, and optimize high-performance, enterprise-grade data systems. The ideal candidate will have strong expertise in cloud-native architectures, ETL pipelines, and modern data platforms, with proven experience driving secure, scalable, and efficient database solutions.


Key Responsibilities :


- Design & Architecture : Define data architecture strategies for relational, non-relational, and distributed systems, ensuring scalability, performance, and reliability.

- Database Management : Architect, implement, and optimize PostgreSQL, MongoDB, Elasticsearch, and Graph DB solutions.

- Cloud Solutions : Leverage AWS cloud-native services (RDS, DynamoDB, Redshift, S3, Glue, Lambda, etc.) for secure, high-availability data platforms.

- ETL & Data Pipelines : Design and manage ETL/ELT workflows using Databricks, Snowflake, and Python-based automation.

- Data Governance & Security : Establish best practices around data quality, compliance, access control, and security frameworks.

- Performance Optimization : Implement indexing, partitioning, caching, and sharding strategies for optimal query performance.

- Collaboration : Partner with engineering, product, and business teams to align data solutions with organizational goals.

- Innovation : Stay updated on emerging database technologies, architectural frameworks, and cloud solutions to drive continuous improvement.


Must-Have Skills :


- Deep expertise in Relational & Non-Relational Databases : PostgreSQL, MongoDB, Elasticsearch, Graph DB.

- Strong experience with AWS Cloud-Native Databases & Services (RDS, Redshift, DynamoDB, Glue, Lambda, S3).

- Proven track record in building scalable ETL pipelines with Databricks & Snowflake.

- Proficiency in Python for data engineering, automation, and orchestration.

- 5+ years of recent experience in an architectural role, driving enterprise-grade database solutions.


Good-to-Have Skills :


- Certifications in PostgreSQL, MongoDB, or Cloud Database Architecture.

- Familiarity with data mesh, data lakehouse architectures, and real-time streaming pipelines (Kafka, Kinesis, Flink).

- Experience with containerization & orchestration (Docker, Kubernetes) for database scaling.

