
Reckonsys Tech Labs - Python Developer - Generative AI

Posted on: 18/09/2025

Job Description

Key Responsibilities :

Python Engineering :

- Build clean, modular, and scalable Python codebases using FastAPI/Django.

- Implement APIs, microservices, and data pipelines to support AI use cases.

Generative AI & RAG :

- Implement RAG pipelines: text preprocessing, embeddings, chunking strategies, retrieval, re-ranking, and evaluation (a small retrieval sketch follows this section).

- Integrate with LLM APIs (OpenAI, Anthropic, Gemini, Mistral) and open-source models (Llama, MPT, Falcon).

- Handle context-window optimization and fallback strategies for production workloads (see the fallback sketch after this section).

MCP (Model Context Protocol) :

- Develop MCP servers to expose tools, resources, and APIs to LLMs.

- Work with the FastMCP SDK and design proper tool/resource decorators.

- Ensure MCP servers follow best practices for discoverability, schema compliance, and security.

Agent-to-Agent (A2A) Workflows :

- Design and implement multi-agent orchestration (e.g., AutoGen, CrewAI, LangGraph).

- Build pipelines for agents to delegate tasks, exchange structured context, and collaborate.

- Add observability, replay, and guardrails to A2A interactions.

Production & Observability :

- Add tracing, logging, and evaluation metrics (PromptFoo, LangSmith, Ragas).

- Optimize for latency, cost, and accuracy in real-world deployments.

- Deploy solutions using Docker, Kubernetes, and cloud platforms (AWS/GCP/Azure).
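
For illustration, a minimal sketch of the retrieval step behind the RAG responsibilities above, assuming sentence-transformers for embeddings and FAISS as the vector index; the model name, chunking, and corpus are placeholders rather than a prescribed stack.

```python
# Minimal RAG retrieval sketch: chunk -> embed -> index -> retrieve.
# Assumes sentence-transformers and faiss-cpu are installed; chunk size,
# model name, and top_k are illustrative placeholders.
import faiss
from sentence_transformers import SentenceTransformer

def chunk(text: str, size: int = 500) -> list[str]:
    """Naive fixed-size chunking; real pipelines add overlap or semantic splits."""
    return [text[i:i + size] for i in range(0, len(text), size)]

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = ["example corpus text goes here"]
chunks = [c for doc in documents for c in chunk(doc)]

# Embed the chunks and build an in-memory FAISS index.
embeddings = model.encode(chunks, convert_to_numpy=True).astype("float32")
index = faiss.IndexFlatL2(embeddings.shape[1])
index.add(embeddings)

def retrieve(query: str, top_k: int = 3) -> list[str]:
    """Return the top_k chunks most similar to the query."""
    query_vec = model.encode([query], convert_to_numpy=True).astype("float32")
    _, ids = index.search(query_vec, top_k)
    return [chunks[i] for i in ids[0] if i != -1]  # FAISS pads with -1 if k > index size
```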

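In the same spirit, a rough sketch of the fallback strategy mentioned under Generative AI & RAG: try a primary LLM provider and fall back to alternatives on failure. The provider callables are hypothetical stubs standing in for real SDK calls (e.g. OpenAI or Anthropic clients); retries, backoff, and cost tracking are omitted.

```python
# Ordered-fallback sketch: try providers in order, return the first success.
# The provider functions are hypothetical stubs; wrap real SDK calls inside them.
import logging

logger = logging.getLogger("llm.fallback")

def call_primary(prompt: str) -> str:
    raise NotImplementedError("wrap the primary LLM client call here")

def call_backup(prompt: str) -> str:
    raise NotImplementedError("wrap the backup LLM client call here")

PROVIDERS = [("primary", call_primary), ("backup", call_backup)]

def generate(prompt: str) -> str:
    """Return the first successful completion, logging each provider failure."""
    last_error = None
    for name, provider in PROVIDERS:
        try:
            return provider(prompt)
        except Exception as exc:  # in production, catch provider-specific errors
            logger.warning("provider %s failed: %s", name, exc)
            last_error = exc
    raise RuntimeError("all providers failed") from last_error
```
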
Required Skills & Qualifications :

- 3-7 years of professional experience with Python (3.9+).

- Strong knowledge of OOP, async programming, and REST API design.

- Proven hands-on experience with RAG implementations and vector databases (Pinecone, Weaviate, FAISS, Qdrant, Milvus).

- Familiarity with MCP (Model Context Protocol) concepts and hands-on experience with MCP server implementations.

- Understanding of multi-agent workflows and orchestration libraries (LangGraph, AutoGen, CrewAI).

- Proficiency with FastAPI/Django for backend development.

- Comfort with Docker, GitHub Actions, and CI/CD pipelines.

- Practical experience with cloud infrastructure (AWS/GCP/Azure).
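
As a concrete companion to the MCP server experience asked for above, a rough sketch of a server exposing one tool and one resource, assuming the FastMCP SDK's decorator-based API; the server name, tool, and resource URI are made up for illustration.

```python
# Hypothetical MCP server exposing one tool and one resource to LLM clients.
# Assumes the FastMCP SDK (pip install fastmcp); all names are illustrative.
from fastmcp import FastMCP

mcp = FastMCP("docs-search")

@mcp.tool()
def search_docs(query: str, top_k: int = 3) -> list[str]:
    """Search the internal knowledge base and return matching snippets."""
    # A real server would call the retrieval layer here.
    return [f"stub result {i} for {query!r}" for i in range(top_k)]

@mcp.resource("docs://{doc_id}")
def get_doc(doc_id: str) -> str:
    """Expose a single document as a read-only MCP resource."""
    return f"contents of document {doc_id}"

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport
```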

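The multi-agent orchestration libraries listed above can be illustrated with a small two-agent handoff, assuming LangGraph's StateGraph API; the agent logic is stubbed rather than calling a real LLM.

```python
# Two-node agent handoff sketch using LangGraph's StateGraph.
# Assumes pip install langgraph; node logic is stubbed, not a real LLM call.
from typing import TypedDict
from langgraph.graph import StateGraph, END

class State(TypedDict):
    question: str
    research: str
    answer: str

def researcher(state: State) -> dict:
    """First agent: gather context for the question (stubbed)."""
    return {"research": f"notes about {state['question']}"}

def writer(state: State) -> dict:
    """Second agent: draft an answer from the researcher's notes (stubbed)."""
    return {"answer": f"answer based on: {state['research']}"}

graph = StateGraph(State)
graph.add_node("researcher", researcher)
graph.add_node("writer", writer)
graph.set_entry_point("researcher")
graph.add_edge("researcher", "writer")
graph.add_edge("writer", END)

app = graph.compile()
result = app.invoke({"question": "What does the retrieval service return?"})
print(result["answer"])
```
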
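Finally, a minimal sketch of the kind of FastAPI backend the Python Engineering responsibilities imply, wiring a request model to a stubbed retrieval call; the endpoint path and model fields are illustrative only.

```python
# Minimal FastAPI service exposing a stubbed question-answering endpoint.
# Endpoint path, model fields, and the retrieval stub are illustrative.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="ai-service")

class AskRequest(BaseModel):
    question: str
    top_k: int = 3

class AskResponse(BaseModel):
    answer: str
    sources: list[str]

def retrieve(question: str, top_k: int) -> list[str]:
    """Stand-in for the vector-store retrieval layer."""
    return [f"snippet {i}" for i in range(top_k)]

@app.post("/ask", response_model=AskResponse)
async def ask(req: AskRequest) -> AskResponse:
    sources = retrieve(req.question, req.top_k)
    # A production handler would call an LLM here with the retrieved context.
    return AskResponse(answer=f"draft answer to: {req.question}", sources=sources)
```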

Nice-to-Have

- Exposure to AI observability & evaluation (LangSmith, PromptFoo, Ragas).

- Contributions to open-source AI/ML or MCP projects.

- Understanding of compliance/security frameworks (SOC-2, GDPR, HIPAA).

- Prior work with custom embeddings, fine-tuning, or LLMOps stacks.

What We Offer :

- Opportunity to own core AI modules (MCP servers, RAG frameworks, A2A orchestration).

- End-to-end involvement, from architecture to MVP to production rollout.

- A fast-moving, engineering-first culture where experimentation is encouraged.

- Competitive compensation, flexible work setup, and strong career growth.

Location :

- Bangalore (Hybrid) / Remote.

Experience Level :

- 3-7 years.

Compensation :

- Competitive, based on expertise.

