
Job Description

Job Title : Big Data NiFi Developer


Location : Pune (Hybrid)


Experience : 3 to 5 Years


Work Mode : Hybrid (2-3 days from client office, rest remote)


Job Description :


We are seeking a highly skilled and motivated Big Data NiFi Developer to join our growing data engineering team in Pune. The ideal candidate will have hands-on experience with Apache NiFi, a strong understanding of big data technologies, and a background in data warehousing or ETL processes. If you are passionate about working with high-volume data pipelines and building scalable data integration solutions, we'd love to hear from you.


Key Responsibilities :


- Design, develop, and maintain data flow pipelines using Apache NiFi.


- Integrate and process large volumes of data from diverse sources using Spark and NiFi workflows.


- Collaborate with data engineers and analysts to transform business requirements into data solutions.


- Write reusable, testable, and efficient code in Python, Java, or Scala.


- Develop and optimize ETL/ELT pipelines for performance and scalability.


- Ensure data quality, consistency, and integrity across systems.


- Participate in code reviews, unit testing, and documentation.


- Monitor and troubleshoot production data workflows and resolve issues proactively.


Skills & Qualifications :


- 3 to 5 years of hands-on experience in Big Data development.


- Strong experience with Apache NiFi for data ingestion and transformation.


- Proficient in at least one programming language : Python, Scala, or Java.


- Experience with Apache Spark for distributed data processing.


- Solid understanding of Data Warehousing concepts and ETL tools/processes.


- Experience working with large datasets and both batch and streaming data processing.


- Knowledge of Hadoop ecosystem and cloud platforms (AWS, Azure, or GCP) is a plus.


- Excellent problem-solving and communication skills.


- Ability to work independently in a hybrid work environment.


Nice To Have :


- Experience with NiFi Registry and version control integration.


- Familiarity with containerization tools (Docker/Kubernetes).


- Exposure to real-time data streaming tools like Kafka.

