
Artificial Intelligence/Machine Learning Engineer - Python

ZIBTEK PRIVATE LIMITED
Bangalore
4 - 6 Years
4.5 | 22+ Reviews

Posted on: 06/01/2026

Job Description



We're looking for a passionate AI/ML Engineer with 4+ years of experience who can bridge the gap between machine learning models and production-ready APIs using Node.js and FastAPI. In this role, you'll train, fine-tune, and deploy ML models; integrate them with scalable backend systems; and build intelligent, data-driven applications using the latest in GenAI, including Hugging Face, OpenAI, LangChain, and more.


Responsibilities :


- Design, train, and deploy machine learning and deep learning models for NLP, vision, or recommendation systems.


- Develop robust APIs using Node.js (Express/Nest.js) and Python FastAPI for serving AI/ML models (a minimal serving sketch follows this list).


- Fine-tune and serve Hugging Face Transformers and LLMs (BERT, GPT, Llama, etc.).


- Build data ingestion and preprocessing pipelines using Python, Pandas, and FastAPI.


- Integrate LLMs and AI agents using frameworks such as LangChain, LlamaIndex, or OpenAI API.


- Implement MLOps workflows including model tracking, CI/CD, and monitoring with tools like MLflow or DVC.


- Deploy and scale models using Docker, Kubernetes, AWS/GCP/Azure, or serverless architectures.


- Collaborate with cross-functional teams (data, frontend, product) to build and ship AI-driven features.
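
Several of these responsibilities reduce to one pattern: wrap a trained model behind an HTTP endpoint. The following is a minimal, illustrative sketch of that pattern with FastAPI and a Hugging Face pipeline; the checkpoint name, route, and request schema are placeholder assumptions, not details taken from this posting.

from fastapi import FastAPI
from fastapi.concurrency import run_in_threadpool
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Load the model once at startup; this checkpoint is only an example stand-in.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

class PredictRequest(BaseModel):
    text: str

@app.post("/predict")
async def predict(req: PredictRequest):
    # The transformers pipeline call is blocking, so run it in a worker thread
    # to keep the async event loop free for other requests.
    result = (await run_in_threadpool(classifier, req.text))[0]
    return {"label": result["label"], "score": float(result["score"])}

A Node.js (Express/Nest.js) gateway would typically call a service like this over HTTP, keeping the Python inference code in its own container so it can be scaled independently.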


Requirements :


- Backend : Node.js (Express/Nest.js), FastAPI.


- ML/AI : Python, TensorFlow, PyTorch, scikit-learn.


- AI/ML Tools : Hugging Face Transformers, LangChain, OpenAI API, LlamaIndex.


- Data Handling : Pandas, NumPy, SQL/NoSQL databases.


- DevOps/MLOps : Docker, Kubernetes, Git, MLflow, DVC.


- API Development : REST, GraphQL, WebSocket.


- Cloud Deployment : AWS (SageMaker/Lambda), GCP (Vertex AI), Azure ML.


- Strong understanding of LLM architecture, embeddings, and vector databases such as Pinecone, FAISS, or Milvus.


- Experience with TypeScript for Node.js backend.


- Experience building chatbots, RAG (Retrieval-Augmented Generation) systems, or AI assistants (a retrieval sketch follows this list).


- Familiarity with FastAPI async endpoints for real-time inference.


- Exposure to model quantisation and optimisation for faster inference.


- Experience integrating FastAPI microservices into Node.js ecosystems.


- B.Tech / M.Tech / MCA in Computer Science, Artificial Intelligence, or a related field.
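
To make the RAG and vector-database expectations above concrete, here is the retrieval half of a RAG system sketched with FAISS and a sentence-transformers embedder; the toy corpus, embedding model, and top-k value are placeholder assumptions rather than anything specified in this posting.

import faiss
from sentence_transformers import SentenceTransformer

# Toy corpus standing in for real ingested documents.
docs = [
    "FastAPI serves ML models over async HTTP endpoints.",
    "FAISS performs fast nearest-neighbour search over dense vectors.",
    "LangChain orchestrates LLM calls, tools, and retrievers.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model
doc_vecs = embedder.encode(docs, convert_to_numpy=True).astype("float32")

# Exact L2 index over the document embeddings.
index = faiss.IndexFlatL2(doc_vecs.shape[1])
index.add(doc_vecs)

def retrieve(query: str, k: int = 2) -> list[str]:
    # Embed the query and return the k closest documents.
    q = embedder.encode([query], convert_to_numpy=True).astype("float32")
    _, idx = index.search(q, k)
    return [docs[i] for i in idx[0]]

# Example usage: the retrieved passages would then be injected into the LLM prompt.
print(retrieve("How do I serve a model with FastAPI?"))

In production the in-memory FAISS index would usually be replaced by a managed vector database such as Pinecone or Milvus, and the retrieved passages would be packed into the LLM prompt via LangChain, LlamaIndex, or a direct OpenAI API call.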

