Posted on: 22/09/2025
We are seeking hands-on Data Engineers with strong data migration experience and exposure to diverse database systems.
The role centers on migrating ETL pipelines and plugins from Fivetran into Nexla/Snowflake, building automation for ingestion, transformation, and observability, and delivering under tight deadlines in a fast-paced environment. This is a technical contributor role, ideal for someone who thrives on execution rather than high-level architecture.
Responsibilities:
- Execute migrations from Fivetran and other legacy connectors into Snowflake/Nexla.
- Design and maintain ETL pipelines using Python, PySpark, DBT, and related tools.
- Build and optimize API-based integrations (authentication, pagination, automation); a sketch follows this list.
- Ensure data quality, consistency, and monitoring across pipelines.
- Contribute to prioritization and phased delivery of ~1,500+ plugins.
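For context on the API-integration bullet, here is a minimal Python sketch of a token-authenticated, cursor-paginated ingestion loop with basic logging for observability. The endpoint URL, bearer-token auth, and the data/next_cursor response shape are illustrative assumptions, not details taken from this posting.

```python
import logging

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

# Hypothetical endpoint and token, for illustration only;
# a real pipeline would load the token from a secrets manager.
BASE_URL = "https://api.example.com/v1/records"
API_TOKEN = "replace-me"


def fetch_all_pages(url: str, token: str, page_size: int = 100) -> list[dict]:
    """Walk a cursor-paginated API, collecting every record."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})

    records: list[dict] = []
    cursor = None
    while True:
        params = {"limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(url, params=params, timeout=30)
        resp.raise_for_status()  # fail loudly on auth/rate-limit errors
        payload = resp.json()

        batch = payload.get("data", [])
        records.extend(batch)
        log.info("fetched %d records (total %d)", len(batch), len(records))

        # Assumed pagination contract: the API returns a next_cursor
        # field that is absent/null on the final page.
        cursor = payload.get("next_cursor")
        if not cursor:
            break
    return records


if __name__ == "__main__":
    rows = fetch_all_pages(BASE_URL, API_TOKEN)
    log.info("ingestion complete: %d rows", len(rows))
```

In a production migration, the same loop would typically add retries with backoff and land the raw payloads in a staging area before transformation.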
Requirements:
- 3-6 years in data engineering or ETL development, with hands-on migration work.
- Strong knowledge of Snowflake (data ingestion, transformation, optimization).
- Proficiency in Python for scripting and automation.
- Experience with PySpark, DBT, and API-driven data integrations (a PySpark sketch follows this list).
- Familiarity with AWS/cloud-hosted data solutions and observability.
- Solid track record of ETL and data engineering platform transitions.
- Prior experience with large-scale plugin migrations.
- Familiarity with Fivetran, Nexla, or similar aggregation platforms.
- Ability to balance scripting-level problem-solving with broader data engineering.
- Comfort working in time-sensitive, high-stakes environments.
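To make the PySpark and data-quality expectations concrete, here is a minimal sketch of a batch transformation with a simple quality gate, writing to a staging path that a downstream Snowflake load could pick up. The S3 paths and column names (order_id, customer_id, created_at, amount) are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_transform").getOrCreate()

# Hypothetical raw landing zone, for illustration only.
orders = spark.read.json("s3://raw-bucket/orders/")

# Quality gate: abort the batch if any row is missing its key.
null_ids = orders.filter(F.col("order_id").isNull()).count()
if null_ids:
    raise ValueError(f"{null_ids} rows missing order_id; aborting load")

# Transformation: daily revenue per customer.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Stage as Parquet for a downstream COPY INTO / external-table load.
daily_revenue.write.mode("overwrite").parquet("s3://staging-bucket/daily_revenue/")
```

In practice, checks like the null-key gate would often live in DBT tests or a monitoring layer rather than inline, but the pattern is the same: validate, transform, stage.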
Posted in: Data Engineering
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1550307