Posted on: 10/12/2025
Description :
We are helping our client identify a Data Architecture Specialist (Greenplum to Snowflake Migration) for a full-time role based in Chennai, with work from home.
Details about the role are below; please reply with your updated resume so we can take it forward for submission.
Data Architecture Specialist (Greenplum to Snowflake Migration)
About the Opportunity :
We are seeking an Architect-level Data Engineering & Cloud Migration expert to lead and execute a large-scale migration from Greenplum to Snowflake. This role requires hands-on architectural depth, strong ownership, and the ability to build highly optimized, scalable, and cloud-native data ecosystems.
You will work closely with stakeholders, engineering teams, and business owners to design, lead, and implement a structured migration roadmap, ensuring performance, data quality, and stability during and after the transition.
Job Type : Full Time
Job Positions : 2
Location : Chennai (Work from Home)
Job Description :
- 10-15+ years of experience (Architect / Senior Data Engineer / Migration Specialist).
- Strong hands-on experience in Greenplum : SQL, performance tuning, optimizer behaviour, storage models, UDFs, ETL frameworks, MPP architecture.
- Deep Snowflake knowledge including :
1. Query performance tuning
2. Virtual warehouses & resource optimization
3. Data sharing, cloning, Time Travel, fail-safe, Snowpipe, stages, streams, tasks, and security integration (several of these features are illustrated in the SQL sketch after this list)
- Expert in SQL, Data Modeling, and MPP systems.
- Strong experience in data migration projects, preferably large-scale enterprise workloads.
- Solid understanding of Python, ETL/ELT tools, dbt, Airflow, or similar orchestration frameworks.
- Ability to design scalable enterprise-grade data architectures.
- Experience leading full-cycle migration programs.
- Strong documentation, communication, and client-handling capability.
- Ability to work independently and provide leadership-level guidance.
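For context on the Snowflake features listed above, here is a minimal SQL sketch covering cloning, Time Travel, streams, and tasks. All database, schema, table, column, and warehouse names (analytics, sales, orders, etl_wh, etc.) are illustrative placeholders, not objects from the client's environment.

  -- Zero-copy clone of a production database for migration testing
  CREATE DATABASE analytics_dev CLONE analytics;

  -- Time Travel: query a table as it existed one hour ago
  SELECT COUNT(*) FROM analytics.sales.orders AT (OFFSET => -3600);

  -- Stream capturing new rows landed in a raw table (e.g. via Snowpipe),
  -- consumed by a scheduled task
  CREATE OR REPLACE STREAM analytics.sales.orders_stream
    ON TABLE analytics.sales.orders_raw;

  CREATE OR REPLACE TASK analytics.sales.merge_orders
    WAREHOUSE = etl_wh
    SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('analytics.sales.orders_stream')
  AS
    INSERT INTO analytics.sales.orders (order_id, order_date, amount)
    SELECT order_id, order_date, amount
    FROM analytics.sales.orders_stream
    WHERE METADATA$ACTION = 'INSERT';

  -- Tasks are created suspended; resume to start the schedule
  ALTER TASK analytics.sales.merge_orders RESUME;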
Key Responsibilities :
1. Architecture & Strategy :
- Define the end-to-end migration blueprint from Greenplum to Snowflake.
- Conduct an architectural assessment of existing Greenplum workloads, schemas, pipelines, UDFs, and dependencies (a catalog-inventory sketch follows this subsection).
- Create detailed migration plans including workload rationalization, performance baselining, and optimization strategies.
- Recommend best practices related to compute, storage, cost modeling, performance tuning, and Snowflake design patterns.
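As a starting point for the assessment and baselining work above, a catalog inventory along these lines (using the PostgreSQL-compatible catalog views that Greenplum exposes) can rank tables by size to prioritize migration waves. Treat it as a sketch; exact catalog columns and system schemas can vary by Greenplum version.

  -- Rank user tables by total size to prioritize migration waves
  SELECT n.nspname AS schema_name,
         c.relname AS table_name,
         pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size
  FROM   pg_class c
  JOIN   pg_namespace n ON n.oid = c.relnamespace
  WHERE  c.relkind = 'r'
    AND  n.nspname NOT IN ('pg_catalog', 'information_schema', 'gp_toolkit')
  ORDER  BY pg_total_relation_size(c.oid) DESC
  LIMIT  50;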
2. Migration Execution :
- Lead migration of datasets, ETL pipelines, stored procedures, and analytics workloads.
- Re-engineer SQL/PLSQL code, ETL logic, and complex transformations for Snowflake compatibility.
- Implement Snowflake objects : warehouses, databases, schemas, stages, streams, tasks, roles, and RBAC (see the setup sketch after this subsection).
- Validate data accuracy, quality, completeness, and performance post-migration.
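A minimal sketch of the Snowflake objects and RBAC referenced above, assuming Greenplum extracts land in cloud storage. The warehouse, database, stage, bucket path, table, and role names are illustrative, and the stage omits the storage integration or credentials a private bucket would require.

  -- Compute and landing area for migrated data
  CREATE WAREHOUSE IF NOT EXISTS migration_wh
    WAREHOUSE_SIZE = 'MEDIUM' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE;

  CREATE DATABASE IF NOT EXISTS gp_migration;
  CREATE SCHEMA   IF NOT EXISTS gp_migration.staging;

  -- External stage over exported Greenplum extracts
  CREATE STAGE IF NOT EXISTS gp_migration.staging.gp_extracts
    URL = 's3://example-bucket/greenplum-extracts/'
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

  -- Target tables are assumed to exist from translated Greenplum DDL
  COPY INTO gp_migration.staging.orders
    FROM @gp_migration.staging.gp_extracts/orders/;

  -- Spot-check completeness against source row counts
  SELECT COUNT(*) FROM gp_migration.staging.orders;

  -- Minimal RBAC for downstream consumers
  CREATE ROLE IF NOT EXISTS analyst_role;
  GRANT USAGE  ON DATABASE gp_migration         TO ROLE analyst_role;
  GRANT USAGE  ON SCHEMA   gp_migration.staging TO ROLE analyst_role;
  GRANT SELECT ON ALL TABLES IN SCHEMA gp_migration.staging TO ROLE analyst_role;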
3. Performance Tuning & Optimization :
- Optimize Snowflake compute usage and cost efficiency.
- Resolve performance bottlenecks in SQL, ingestion frameworks, and query execution layers.
- Apply design principles for partitioning, clustering, materialized views, micro-partitioning optimization, and caching (illustrated in the tuning sketch after this subsection).
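The tuning levers above look roughly like this in Snowflake SQL. Table, column, and warehouse names are placeholders, and materialized views assume an edition that supports them.

  -- Clustering key so common date/region filters prune micro-partitions
  ALTER TABLE gp_migration.analytics.orders CLUSTER BY (order_date, region);

  -- Inspect clustering depth for the chosen key
  SELECT SYSTEM$CLUSTERING_INFORMATION(
           'gp_migration.analytics.orders', '(order_date, region)');

  -- Pre-aggregate a hot query path
  CREATE MATERIALIZED VIEW gp_migration.analytics.daily_sales AS
  SELECT order_date, region, SUM(amount) AS total_amount
  FROM   gp_migration.analytics.orders
  GROUP  BY order_date, region;

  -- Right-size compute and suspend aggressively to control credit spend
  ALTER WAREHOUSE reporting_wh SET
    WAREHOUSE_SIZE = 'LARGE'
    AUTO_SUSPEND   = 120
    AUTO_RESUME    = TRUE;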
4. Collaboration & Stakeholder Management :
- Work cross-functionally with engineering, BI, analytics, and product teams to ensure smooth migration.
- Document architecture, migration steps, and operational guidelines.
- Provide guidance, best practices, and upskilling support to internal teams.
Must-Have Skills :
Greenplum (SQL, tuning, MPP concepts), Snowflake architecture, migration experience (Greenplum to Snowflake preferred), SQL optimization & query rewriting, Python/dbt/Airflow, Data Modeling, AWS/Azure
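As an illustration of the Greenplum-to-Snowflake rewriting this role involves, here is a hypothetical DDL translation; the table and columns are invented, and real migrations also have to map UDFs, sequences, and procedural code.

  -- Greenplum (source): MPP distribution and append-optimized columnar storage
  -- CREATE TABLE sales.orders (
  --     order_id   BIGINT,
  --     order_date DATE,
  --     amount     NUMERIC(12,2)
  -- )
  -- WITH (appendonly=true, orientation=column)
  -- DISTRIBUTED BY (order_id);

  -- Snowflake (target): distribution and storage clauses disappear;
  -- micro-partitioning is automatic, with an optional clustering key
  CREATE TABLE sales.orders (
      order_id   NUMBER(38,0),
      order_date DATE,
      amount     NUMBER(12,2)
  )
  CLUSTER BY (order_date);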
Academic :
- Postgraduate/Graduate in Engineering/Technology
Functional Area : Data Engineering
Job Code : 1587762