Senior Python Developer - ETL & Data Pipeline Automation

Posted on: 18/09/2025

Job Description

Role overview:

- Proficient in Python, with hands-on experience using Pandas for large dataset processing.

- Solid knowledge of ETL frameworks and pipeline automation strategies.

- Experience with Elasticsearch: indexing, querying, and optimizing for performance.

- Strong command of Hive SQL and working with distributed data systems.

- Experience building APIs using Flask or similar Python frameworks.

- Understanding of data modeling, schema design, and workflow orchestration.

- Strong debugging, analytical, and communication skills.

What would you do here:

- Design, develop, and automate scalable ETL workflows using Python and Pandas (a minimal sketch follows this list).

- Implement and manage data ingestion and transformation processes using Hive SQL for data warehousing.

- Configure, monitor, and optimize Elasticsearch clusters for indexing, searching, and analytics performance.

- Build and maintain RESTful APIs using Flask to facilitate secure data exchange (see the Flask/Elasticsearch sketch after this list).

- Collaborate with Data Scientists, Analysts, and Engineering teams to support data-driven applications and reporting tools.

- Ensure high standards of data quality, reliability, and integrity throughout the pipeline.

- Monitor performance, debug issues, and tune queries for maximum efficiency.

- Document data flows, pipeline architecture, and automation logic clearly.
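By way of illustration, here is a minimal sketch of the kind of Pandas-based ETL step described above, assuming a Hive connection via PyHive; the host, table, column names, and output path are all hypothetical:

```python
# Hypothetical ETL step: extract from Hive, transform with Pandas, load to Parquet.
# Connection details, table, and columns are illustrative assumptions.
import pandas as pd
from pyhive import hive  # assumes the PyHive package is installed

def run_etl(host: str = "hive.internal", out_path: str = "events_clean.parquet") -> None:
    conn = hive.Connection(host=host, port=10000, database="analytics")
    try:
        # Extract: stream the source table in chunks to keep memory bounded.
        chunks = pd.read_sql(
            "SELECT user_id, event_type, event_ts, amount FROM raw_events",
            conn,
            chunksize=50_000,
        )
        cleaned = []
        for chunk in chunks:
            # Transform: drop malformed rows and normalise types.
            chunk = chunk.dropna(subset=["user_id", "event_ts"])
            chunk["event_ts"] = pd.to_datetime(chunk["event_ts"], errors="coerce")
            chunk["amount"] = chunk["amount"].fillna(0.0).astype(float)
            cleaned.append(chunk)
        result = pd.concat(cleaned, ignore_index=True)
        # Load: write a columnar file for downstream warehousing jobs.
        result.to_parquet(out_path, index=False)
    finally:
        conn.close()

if __name__ == "__main__":
    run_etl()
```

Chunked reads keep memory use bounded on large tables, which matters for the "large dataset processing" requirement above.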
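And a sketch of a Flask endpoint backed by Elasticsearch, in the spirit of the API and search responsibilities above; the cluster URL, index name, and document fields are assumptions:

```python
# Hypothetical Flask service exposing an Elasticsearch-backed search endpoint.
# The cluster URL, index name, and document fields are illustrative assumptions.
from elasticsearch import Elasticsearch
from flask import Flask, jsonify, request

app = Flask(__name__)
es = Elasticsearch("http://localhost:9200")

@app.route("/search")
def search():
    query = request.args.get("q", "")
    resp = es.search(
        index="events",
        query={"match": {"event_type": query}},  # simple full-text match
        size=20,
    )
    hits = [hit["_source"] for hit in resp["hits"]["hits"]]
    return jsonify(results=hits, total=resp["hits"]["total"]["value"])

if __name__ == "__main__":
    app.run(port=5000)
```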

Senior Level:

Required Skills & Experience:

- 8-12+ years of experience in professional Python development.

- Proven expertise in designing and building backend systems and APIs at scale.

- Strong experience with LLMs, GenAI integrations, and agentic, AI-enhanced workflows.

- Solid understanding of microservices architecture and distributed systems.

- Proficiency in automated testing and observability best practices.

- Hands-on experience building and maintaining CI/CD pipelines.

- Strong problem-solving, communication, and stakeholder management skills.

Must Have:

- Python, LLD/HLD, AI/LLM integration, unit testing, code reviews, CI/CD pipelines, frameworks such as FastAPI, databases such as PostgreSQL, monitoring and visualisation tools such as Prometheus and Grafana, and team leading.

Key Responsibilities:

System Architecture & Design:

- Architect and develop secure, scalable backend systems using Python.

- Lead high- and low-level design (HLD/LLD) with a focus on modularity, fault tolerance, and API best practices.

- Champion performance optimization and drive observability improvements across services.

AI/LLM Integration:

- Integrate Large Language Models (LLMs) and GenAI-powered features into core applications.

- Build robust systems that mitigate hallucinations and improve AI reliability.

- Leverage agentic frameworks and AI-driven automation for process enhancement.

- Practical experience with the OpenAI developer ecosystem, including APIs (Completions, Responses, Files) and tools/SDKs (Agents SDK, Web Search); a minimal integration sketch follows this list.

- Familiarity with agent debugging/monitoring tools (Phoenix, LangSmith, or similar).
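As a rough illustration of the LLM integration work described above, here is a sketch using the OpenAI Python SDK's Chat Completions API; the model name, prompt wording, and context-grounding approach are assumptions, not a prescribed design:

```python
# Hypothetical LLM-backed helper: answers only from supplied context to
# reduce hallucinations. Model name and prompt wording are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer_from_context(question: str, context: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer strictly from the provided context. "
                    "If the context does not contain the answer, say 'I don't know'."
                ),
            },
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
        temperature=0,  # deterministic output for reliability
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(answer_from_context("What is the SLA?", "Our pipeline SLA is 99.9% uptime."))
```

Constraining the model to supplied context and pinning temperature to 0 are two simple levers for the "mitigate hallucinations" responsibility above.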

Engineering Excellence:

- Define and enforce engineering best practices across testing, code reviews, and quality gates.

- Ensure strong observability through logging, monitoring, and alerting tools (see the instrumentation sketch after this list).

- Write unit, integration, and functional tests to support high-reliability systems (a minimal test example follows this list).
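Here is a minimal pytest example of the testing discipline referenced above, written against a hypothetical transform helper:

```python
# Minimal pytest sketch for a hypothetical transform step in the pipeline.
import pandas as pd

def normalise_amounts(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical helper: fill missing amounts and coerce to float."""
    out = df.copy()
    out["amount"] = out["amount"].fillna(0.0).astype(float)
    return out

def test_normalise_amounts_fills_missing_values():
    df = pd.DataFrame({"amount": [1, None, "3"]})
    result = normalise_amounts(df)
    assert result["amount"].tolist() == [1.0, 0.0, 3.0]

def test_normalise_amounts_does_not_mutate_input():
    df = pd.DataFrame({"amount": [None]})
    normalise_amounts(df)
    assert df["amount"].isna().all()
```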
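And a sketch of Prometheus-style instrumentation (the Must Have list names Prometheus and Grafana); the metric names, labels, and port are assumptions:

```python
# Hypothetical service instrumentation with prometheus_client.
# Metric names, labels, and the port are illustrative assumptions.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS = Counter("pipeline_requests_total", "Requests processed", ["status"])
LATENCY = Histogram("pipeline_request_seconds", "Request latency in seconds")

@LATENCY.time()
def handle_request() -> None:
    # Stand-in for real work; record success/failure for alerting rules.
    time.sleep(random.uniform(0.01, 0.05))
    REQUESTS.labels(status="ok").inc()

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        handle_request()
```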

DevOps & Deployment:

- Build and manage CI/CD pipelines for automated testing and smooth deployments.

- Ensure deployment processes support rollback, scalability, and system health.

Collaboration & Leadership:

- Collaborate closely with product, data, business, and DevOps teams to align on solution delivery.

- Mentor and support junior engineers, fostering a culture of continuous improvement and technical excellence.

Python & Backend Engineering:

- Strong Python development skills with experience in frameworks like FastAPI.

- Proven experience designing REST APIs, microservices, and backend architectures.

- Experience integrating with databases (e.g., PostgreSQL) for real-time and batch workloads (a minimal FastAPI/PostgreSQL sketch follows this list).

- Experience working with data pipelines and ETL frameworks (e.g., jobs moving data from GCP to Postgres) and integrating this data into backend services or AI workflows.
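As a closing illustration of the FastAPI-plus-PostgreSQL stack named above, here is a minimal sketch using SQLAlchemy; the DSN, table name, and columns are assumptions:

```python
# Hypothetical FastAPI service reading from PostgreSQL via SQLAlchemy.
# The DSN, table name, and columns are illustrative assumptions.
from fastapi import FastAPI, HTTPException
from sqlalchemy import create_engine, text

app = FastAPI()
engine = create_engine("postgresql://app:secret@localhost:5432/warehouse")

@app.get("/users/{user_id}/events")
def list_events(user_id: int, limit: int = 20):
    with engine.connect() as conn:
        rows = conn.execute(
            text(
                "SELECT event_type, event_ts FROM events "
                "WHERE user_id = :uid ORDER BY event_ts DESC LIMIT :lim"
            ),
            {"uid": user_id, "lim": limit},
        ).mappings().all()
    if not rows:
        raise HTTPException(status_code=404, detail="no events for user")
    return {"user_id": user_id, "events": [dict(r) for r in rows]}

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)
```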
