hirist

Job Description

Exp : 5+ years


Location : Remote


Job Type : 6-month contract + ext.


Job Title : Teradata-to-Snowflake Migration Developer (or Data Engineer / Migration Specialist)


Position Overview :


This role leads the end-to-end migration of legacy data warehouse systems (especially Teradata) to Snowflake, ensuring a seamless transition, data integrity, and cost-effective operations post-migration. The developer works collaboratively with stakeholders across data engineering, analytics, and business teams to design, build, validate, and optimize data pipelines and code.


Key Responsibilities :


Migration Planning & Strategy :


- Assess current Teradata systems: schemas, tables, views, stored procedures, BTEQ scripts, macros, UDFs, and indexes.


- Define migration scope: lift-and-shift vs. redesign approaches; prioritize workloads and establish SLAs and rollout timelines.


Schema & Code Conversion :


- Map Teradata schemas/data types to Snowflake equivalents; leverage tools like SnowConvert for SQL and code translation.


- Rewrite or refactor Teradata scripts, macros, and stored procedures (e.g., BTEQ) into Snowflake-compatible SQL, JavaScript, or Python procedures.
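The type-mapping step above can be sketched as a small lookup table. This is a hypothetical illustration only: the mappings shown are common conventions, not an authoritative list, and real migrations rely on tools like SnowConvert for full schema and code translation.

```python
# Hypothetical sketch: map common Teradata column types to their usual
# Snowflake equivalents during schema conversion. Illustrative only;
# not an exhaustive or authoritative mapping.
TERADATA_TO_SNOWFLAKE = {
    "BYTEINT": "TINYINT",
    "SMALLINT": "SMALLINT",
    "INTEGER": "INTEGER",
    "DECIMAL": "NUMBER",
    "VARCHAR": "VARCHAR",
    "CLOB": "VARCHAR",         # Snowflake has no CLOB; VARCHAR holds large text
    "BLOB": "BINARY",
    "TIMESTAMP": "TIMESTAMP_NTZ",
    "PERIOD": None,            # no direct equivalent; flag for redesign
}

def map_type(teradata_type: str) -> str:
    """Translate e.g. 'DECIMAL(10,2)' -> 'NUMBER(10,2)'."""
    base = teradata_type.split("(")[0].strip().upper()
    target = TERADATA_TO_SNOWFLAKE.get(base)
    if target is None:
        raise ValueError(f"{teradata_type}: no direct Snowflake equivalent")
    # Preserve a precision/scale suffix like (10,2) if present.
    suffix = teradata_type[teradata_type.find("("):] if "(" in teradata_type else ""
    return target + suffix
```

Types with no direct equivalent (such as PERIOD) are raised as errors so they can be routed to the redesign track rather than silently converted.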


ETL/ELT Pipeline Development :


- Build and optimize data ingestion pipelines using tools such as Azure Data Factory, dbt, Airflow, Informatica, Talend, Fivetran, Matillion or Snowflake Tasks. Support both batch and incremental loads.


- Implement COPY INTO or Snowpipe where relevant, with automation and orchestration pipelines.
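A batch load via COPY INTO can be sketched as below. The table name, stage path, and file-format options are hypothetical; in practice the statement would run through the Snowflake connector or an orchestrator (ADF, Airflow, etc.), and Snowpipe would be layered on top for continuous ingestion.

```python
# Hypothetical sketch: compose a Snowflake COPY INTO statement for a
# batch load from a named stage. Names and options are illustrative.
def build_copy_into(table: str, stage_path: str) -> str:
    return (
        f"COPY INTO {table}\n"
        f"FROM @{stage_path}\n"
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)\n"
        "ON_ERROR = 'ABORT_STATEMENT';"
    )

print(build_copy_into("analytics.public.orders", "migration_stage/orders/"))
```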


Data Validation & Testing :


- Execute data validation using row counts, checksums, hashing, and content comparison between Teradata and Snowflake environments.


- Perform functional testing, performance testing, and user acceptance (UAT) with stakeholders.
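The row-count and checksum comparison above can be sketched as follows. This is a simplified, offline illustration: the "query results" are plain Python lists, whereas a real run would pull rows or hash aggregates from the Teradata and Snowflake cursors.

```python
import hashlib

def table_checksum(rows) -> int:
    """Order-independent checksum: hash each row, XOR the digests.
    Caveat: XOR cancels duplicate rows; production checks use stronger
    aggregates. Illustrative only."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
    return acc

def validate_table(source_rows, target_rows):
    """Compare a source (Teradata) extract against its target (Snowflake)
    counterpart by row count first, then content checksum."""
    if len(source_rows) != len(target_rows):
        return False, "row count mismatch"
    if table_checksum(source_rows) != table_checksum(target_rows):
        return False, "checksum mismatch"
    return True, "ok"
```

Because the checksum is order-independent, the two extracts do not need matching ORDER BY clauses, which keeps the validation queries cheap on both engines.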


Performance & Cost Optimization :


- Optimize Snowflake workloads with clustering keys, virtual warehouse sizing, query tuning, caching, and warehouse auto-suspend/resume policies.


- Monitor and control compute costs through warehouse tuning and usage patterns.
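The main cost levers mentioned above (warehouse size and auto-suspend/resume) can be set with a single ALTER WAREHOUSE statement, sketched here. The warehouse name and thresholds are hypothetical.

```python
# Hypothetical sketch: generate the ALTER WAREHOUSE statement that sets
# size and auto-suspend/auto-resume, the main levers for compute cost.
def warehouse_policy_sql(name: str, size: str = "SMALL",
                         auto_suspend_secs: int = 60) -> str:
    return (
        f"ALTER WAREHOUSE {name} SET\n"
        f"  WAREHOUSE_SIZE = '{size}'\n"
        f"  AUTO_SUSPEND = {auto_suspend_secs}\n"
        f"  AUTO_RESUME = TRUE;"
    )

print(warehouse_policy_sql("MIGRATION_WH", size="MEDIUM", auto_suspend_secs=120))
```

A short auto-suspend window means an idle warehouse stops billing quickly, while auto-resume keeps the change transparent to users.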


Architecture & Governance :


- Design Snowflake account structure: databases, schemas, roles, security, RBAC, and access control. Implement data encryption and governance (e.g., masking, role-based access).


- Document architecture, data flows, migration plans, rollback strategy, and operational playbooks.


Collaboration & Support :


- Partner with stakeholders across data science, BI, analytics, IT and security teams to ensure integration and alignment.


- Conduct knowledge transfer, end-user training, and documentation on Snowflake best practices, query patterns, and cost management.


Required Skills & Qualifications :


Experience : 4-10+ years in data engineering or data platform roles, including 2+ years dedicated to Teradata-to-Snowflake migration projects.


Technical Skills :


- Teradata and Snowflake data warehousing


- SQL (Teradata SQL, SnowSQL), stored procedures, UDFs


- Python (or JavaScript), shell scripting for automation


- ETL/ELT & orchestration tools (e.g. Azure ADF, dbt, Airflow, Talend, Fivetran)


- Data Modeling : Understanding of schema design, normalization, dimensional modeling, and handling large volumes of data.


- Cloud Platforms : Hands-on experience with one or more cloud platforms (AWS, Azure, GCP); building pipelines using Databricks/Azure Data Lake or equivalent.


- Optimization & Governance : Query tuning, warehouse sizing, cost control, security best practices, access control, data masking, and compliance.


Soft Skills : Excellent collaboration, documentation, problem-solving, and communication; ability to mentor junior engineers.


Preferred Qualifications (Optional) :


- Certifications such as SnowPro Core or SnowPro Advanced, or Azure/AWS data certifications.


- Experience with CI/CD for data pipelines (e.g. Git, Jenkins, GitLab CI) and version control.


- Familiarity with real-time ingestion tools like Kafka, Pub/Sub, Snowflake Tasks.


- Prior exposure to big data platforms (Spark, Hadoop, Delta Lake, Lakehouse architecture).


- Experience working in Agile/Scrum environments using Jira or Confluence.

