hirist

Data Engineer - ETL Pipeline

InCommon
Multiple Locations
3 - 6 Years

Posted on: 23/09/2025

Job Description

We are seeking hands-on Data Engineers with strong data migration experience and exposure to diverse database systems.

Responsibilities :

- Migrate ETL pipelines and plugins from Fivetran and other legacy connectors into Nexla/Snowflake.

- Build automation for ingestion, transformation, and observability.

- Deliver under tight deadlines in a fast-paced environment.

- Operate as a hands-on technical contributor, focusing on execution rather than high-level architecture.

- Design and maintain ETL pipelines using Python, PySpark, DBT, and related tools.

- Build and optimize API-based integrations (authentication, pagination, automation).

- Ensure data quality, consistency, and monitoring across pipelines.

- Contribute to the prioritization and phased delivery of approximately 1,500 plugins.
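
The API-integration work above typically means handling cursor-based pagination behind an authenticated endpoint. A minimal sketch of that loop is below; the `fetch_page` helper, the `next_cursor` field, and the simulated payloads are all hypothetical stand-ins for whatever source API a given plugin targets, not part of this role's actual codebase.

```python
from typing import Dict, List, Optional

def fetch_page(pages: Dict[Optional[str], dict], cursor: Optional[str]) -> dict:
    """Stand-in for an HTTP GET against a paginated source API.

    A real connector would issue something like
    requests.get(url, headers=auth_headers, params={"cursor": cursor}).
    """
    return pages[cursor]

def extract_all(pages: Dict[Optional[str], dict]) -> List[dict]:
    """Follow cursor-based pagination until the source reports no next page."""
    records: List[dict] = []
    cursor: Optional[str] = None
    while True:
        payload = fetch_page(pages, cursor)
        records.extend(payload["data"])
        cursor = payload.get("next_cursor")
        if cursor is None:  # last page reached
            break
    return records

# Simulated three-page response from a hypothetical source API.
pages = {
    None: {"data": [{"id": 1}, {"id": 2}], "next_cursor": "p2"},
    "p2": {"data": [{"id": 3}], "next_cursor": "p3"},
    "p3": {"data": [{"id": 4}], "next_cursor": None},
}
print(len(extract_all(pages)))  # 4 records across 3 pages
```

In production the extracted batch would then be staged and loaded into Snowflake (or handed to Nexla), with retry and observability hooks around the fetch loop.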


Requirements :

- 3-6 years in data engineering or ETL development, with hands-on migration work.

- Strong knowledge of Snowflake (data ingestion, transformation, optimization).

- Proficiency in Python for scripting and automation.

- Experience with PySpark, DBT, and API-driven data integrations.

- Familiarity with AWS/cloud-hosted data solutions and observability.

- Solid track record of delivering ETL and data engineering migrations.


Nice to Have :

- Prior experience with large-scale plugin migrations.

- Familiarity with Fivetran, Nexla, or similar aggregation platforms.

- Ability to balance scripting-level problem-solving with broader data engineering.

- Comfort working in time-sensitive, high-stakes environments.

