hirist

Senior Data Engineer - Python/Snowflake

Posted on: 20/11/2025

Job Description



Key Responsibilities:


- Design, build, and maintain scalable data pipelines for structured, semi-structured, and unstructured data.


- Ingest data into Snowflake from multiple sources (databases, APIs, files, streaming data, cloud storage, SaaS apps).


- Implement ETL/ELT processes using SQL, Python scripts, Airflow, Fivetran, or other relevant open-source tools.


- Collaborate with the broader data engineering, data science, and AI/ML teams to prepare datasets and establish data pipelines for more than 60 systems.


- Ensure data quality, integrity, and governance across data pipelines.


- Work with business and data science teams to understand data requirements and translate them into technical solutions.


- Maintain documentation, monitoring, and alerting setup for pipelines.


- Own end-to-end responsibility for all pipelines.


- Mentor junior developers, perform code reviews and quality checks, and act as the offshore lead for data pipelines.


Mandatory Experience and Skills:


- Hands-on experience building data pipelines for structured and unstructured data.


- High proficiency in Python for data extraction, transformation, and loading.


- Hands-on experience with pipeline orchestration tools such as Airflow, Dagster, or Prefect (Airflow/Dagster preferred).


- Hands-on experience with ETL/ELT tools (Fivetran, Apache NiFi, Airbyte, or equivalent open-source tools) is a plus.


- Hands-on experience consuming APIs for data extraction; validating, deduplicating, and transforming JSON/JSONL/XML/CSV/Avro/Parquet data; and loading it into a cloud data warehouse.


- Familiarity with change data capture (CDC) mechanisms for databases.


- Familiarity with cloud storage and services (AWS S3, Azure Blob, GCP Cloud Storage).


- Knowledge of data governance, DLP, security, and compliance; API security (OAuth2, JWT, API keys, Basic Auth, mTLS); and data encryption at rest and in transit.


- Experience with streaming or real-time ingestion (Kafka, Azure Event Hubs) is a plus.


- Strong problem-solving skills and the ability to work in cross-functional teams.


- Strong communication skills, with the ability to participate in and lead technical discussions.


- Ability to learn and master new technologies as needed by the team.


Good to Have:


- Experience with Snowflake (tables, streams, tasks, stages, Snowpipe).


- Experience building data pipelines for AI/ML and vector database ingestion.

