hirist

Data Integration Engineer - Apache NiFi

Posted on: 03/12/2025

Job Description

Description :


Position : NiFi Developer


Experience : 4+ Years


Location : Chennai


Employment Type : Full-Time


Job Summary :


We are seeking an experienced NiFi Developer with a strong background in data integration, data flow design, and data modeling. The ideal candidate will be responsible for building, optimizing, and maintaining robust data pipelines using Apache NiFi to ensure efficient data movement and transformation across systems.


Key Responsibilities :


- Design, develop, and implement complex data flows using Apache NiFi.


- Integrate data from multiple sources and formats into centralized systems or data lakes.


- Build scalable and reusable NiFi templates and processors for data ingestion and transformation.


- Perform data modeling to support efficient data storage, retrieval, and analytics.


- Collaborate with cross-functional teams (Data Engineers, Architects, and Analysts) to define data flow and integration strategies.


- Optimize performance and ensure high availability of data pipelines.


- Troubleshoot and resolve data flow or pipeline issues in a timely manner.


- Maintain clear documentation of data flow processes and configurations.


Required Skills & Qualifications :


- 4+ years of hands-on experience in developing data pipelines using Apache NiFi.


- Strong understanding of data integration concepts, ETL processes, and data flow design.


- Proficiency in data modeling (conceptual, logical, and physical).


- Experience with relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).


- Knowledge of REST APIs, JSON, XML, and data serialization formats.


- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services is a plus.


- Strong problem-solving and analytical skills.


- Excellent communication and collaboration abilities.


Preferred Skills :


- Experience with NiFi Registry and NiFi Cluster configuration.


- Exposure to other big data technologies such as Kafka, Spark, or Hadoop.


- Basic scripting experience (Python, Shell, or Groovy).


Mandatory Skill :


- Data modeling.

