Posted on: 03/11/2025
Location: Pune (onsite)
About the Role:
We are looking for an experienced Senior Python Developer (3-6 years) to design and scale offline applications, data workflows, and API integrations.
The role requires strong backend engineering expertise, with hands-on experience in workflow automation, third-party API integration, and data processing pipelines.
Key Responsibilities:
- Design and develop automation pipelines and offline applications that integrate with multiple APIs.
- Build and manage data workflows for ingesting, transforming, and exporting data between systems.
- Develop resilient task execution frameworks using asyncio-based workers.
- Integrate with third-party APIs (REST/GraphQL/SDKs) for payments, ads, reporting, and retail systems.
- Work with relational and non-relational databases such as PostgreSQL, MySQL, and MongoDB.
- Implement caching strategies using Redis to optimize workflows.
- Ensure workflows are reliable, fault-tolerant, and recoverable.
- Write and maintain unit, integration, and workflow-level tests.
- Collaborate with DevOps teams to deploy automation pipelines on Docker/Kubernetes.
- Conduct performance and load testing using tools like Locust, JMeter, or k6.
- Participate in architecture discussions, code reviews, and technical planning.
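Several of the responsibilities above (asyncio-based workers, fault tolerance, retries) come together in one common pattern. A minimal sketch, assuming nothing about the actual codebase; the task name, retry counts, and delays are illustrative:

```python
import asyncio
import random

async def run_with_retries(task, *, attempts=3, base_delay=0.01):
    """Run an awaitable factory, retrying on failure with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return await task()
        except Exception:
            if attempt == attempts:
                raise  # out of attempts: surface the error to the caller
            # Exponential backoff with jitter to avoid thundering-herd retries
            delay = base_delay * 2 ** (attempt - 1) * (1 + random.random())
            await asyncio.sleep(delay)

async def flaky_fetch(state={"calls": 0}):
    """Illustrative task that fails twice with a transient error, then succeeds."""
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = asyncio.run(run_with_retries(flaky_fetch))
print(result)  # "ok" after two retried failures
```

In a real pipeline the retry wrapper would typically live in a shared worker framework (or be delegated to Celery/RQ retry policies) rather than be re-implemented per task.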
Required Skills & Experience:
- 3-6 years of experience in Python development.
- Strong understanding of computer science fundamentals (BE/BTech or MS in CS/IT).
- Hands-on experience with automation scripts, ETL pipelines, or data workflows.
- Proficiency with task queues and schedulers (Celery, RQ, Airflow, APScheduler).
- Strong API integration skills (REST, GraphQL, SDKs).
- Familiarity with asynchronous programming (asyncio, aiohttp).
- Database experience with PostgreSQL/MySQL/MongoDB, including query optimization.
- Experience with Redis for caching and background jobs.
- Knowledge of error handling, retries, and backoff strategies.
- Experience with Git, CI/CD pipelines, Docker, and Kubernetes.
- Understanding of monitoring tools (Prometheus, Grafana, ELK Stack).
- Cloud experience (AWS, GCP, or Azure) for scalable deployments.
- Experience with pytest or unittest for test-driven development.
- Exposure to message queues (Kafka, RabbitMQ, SQS) and event-driven systems.
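The caching skill listed above follows a standard read-through pattern. A self-contained sketch using an in-memory dict with a TTL in place of Redis (in production the get/set calls would go through a Redis client, e.g. redis-py's GET/SETEX); the function, key format, and TTL are illustrative assumptions:

```python
import time

class TTLCache:
    """Tiny in-memory stand-in for a Redis cache with per-entry expiry."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired: evict and report a miss
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

upstream_calls = []

def expensive_report(month, cache=TTLCache(ttl_seconds=60)):
    """Read-through cache: serve from cache, fall back to the slow source."""
    cached = cache.get(month)
    if cached is not None:
        return cached
    upstream_calls.append(month)   # stand-in for a slow API/database call
    result = f"report-{month}"
    cache.set(month, result)
    return result

print(expensive_report("2025-01"))  # computes and caches
print(expensive_report("2025-01"))  # served from cache; no second upstream call
```

The design choice worth noting is that expiry is checked on read rather than by a background sweeper, which is also how Redis TTLs behave from a client's point of view.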
Posted in: Backend Development
Functional Area: Backend Development
Job Code: 1568952