Posted on: 26/12/2025
Responsibilities:
- Architect, implement, and optimize Dremio-based lakehouse environments on cloud platforms (AWS/Azure), integrating with Snowflake and Databricks.
- Define ingestion, curation, and semantic modeling strategies for analytics and AI workloads.
- Optimize query performance using reflections, caching, and tuning.
- Integrate diverse data sources (APIs, JDBC, Parquet/Delta, S3/ADLS).
- Establish best practices for security, lineage, and governance.
- Enable self-service analytics with governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for deployment and scaling.
Ideal Candidate:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, dbt, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
Job Specific Criteria:
- CV attachment is mandatory.
Posted in: Data Engineering
Functional Area: Technical / Solution Architect
Job Code: 1594514