hirist

Blute Technologies - Senior Data Engineer - ETL/Python/SQL

Blute Technologies Private Limited
Multiple Locations
8 - 10 Years

Posted on: 09/07/2025

Job Description

Job Type : Contract-to-Hire position

Job Summary :

We are seeking a skilled Data Engineer with 6-8 years of experience in designing, building, and maintaining scalable data pipelines and ETL workflows. The ideal candidate will have hands-on expertise in Apache NiFi, KNIME, and other ETL tools, and will play a key role in ensuring data availability, quality, and reliability across the organization.

Key Responsibilities :

- Design, develop, and maintain robust ETL pipelines using Apache NiFi, KNIME, and other tools.

- Integrate data from various sources including APIs, databases, flat files, and cloud storage.

- Optimize data workflows for performance, scalability, and reliability.

- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.

- Monitor and troubleshoot data pipeline issues and ensure data quality and consistency.

- Implement data governance, lineage, and documentation practices.

- Work with cloud platforms (AWS, Azure, or GCP) for data storage and processing.

Required Skills & Qualifications :

- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

- 5 years of experience in data engineering or a similar role.

- Strong hands-on experience with Apache NiFi and KNIME for data integration and transformation.

- Proficiency in SQL and scripting languages like Python or Shell.

- Experience with relational and NoSQL databases (e.g., PostgreSQL, MongoDB).

- Familiarity with data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).

- Understanding of data modeling, data quality, and data governance principles.

Preferred :

- Experience with workflow orchestration tools like Apache Airflow or Luigi.

- Exposure to CI/CD practices and version control (e.g., Git).

- Knowledge of real-time data processing frameworks (e.g., Kafka, Spark Streaming).

Mandatory Skills :

- ETL pipelines using Apache NiFi, KNIME, ETL, DWH, Python, SQL.

- Work with cloud platforms (AWS/Azure/GCP/Snowflake) for data storage and processing.
