Posted on: 25/11/2025
Description:
- Develop high-performance data pipelines for real-time and batch processing (see the sketch after this list).
- Collaborate with cross-functional teams to understand project requirements and translate them into technical specifications.
- Design and implement complex, scalable, and reliable systems that meet business requirements.
- Troubleshoot, debug, and optimize code to ensure performance and reliability.
- Stay up-to-date with the latest industry trends, tools, and technologies.
- Work closely with DevOps to deploy and maintain applications using Docker and Kubernetes.
- Utilize cloud platforms such as AWS to build, deploy, and scale applications.
- Integrate and utilize various cloud services to enhance system functionality.
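
As one illustration of the first bullet, here is a minimal sketch of a single real-time pipeline stage in Java, assuming the Apache Kafka kafka-clients library; the broker address, the topic names raw-events and enriched-events, and the transform itself are hypothetical placeholders, not details from this posting.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class PipelineStage {
        public static void main(String[] args) {
            // Hypothetical broker address and topic names, for illustration only.
            Properties consumerProps = new Properties();
            consumerProps.put("bootstrap.servers", "localhost:9092");
            consumerProps.put("group.id", "pipeline-stage");
            consumerProps.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            consumerProps.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            Properties producerProps = new Properties();
            producerProps.put("bootstrap.servers", "localhost:9092");
            producerProps.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            producerProps.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
                consumer.subscribe(Collections.singletonList("raw-events"));
                while (true) {
                    // Poll a batch of records, transform each one, and forward it downstream.
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        String enriched = record.value().trim().toUpperCase(); // placeholder transform
                        producer.send(new ProducerRecord<>("enriched-events", record.key(), enriched));
                    }
                }
            }
        }
    }

A production stage would add batching, error handling, and offset-commit tuning; the sketch only shows the consume-transform-produce shape.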
Requirements:
- Experience in big data processing using any of Snowflake, Redshift, Spark, Flink, Iceberg, Kafka, etc.
- Proficient in data structures and algorithms (DSA), with advanced Java/Python programming skills.
- Expert in end-to-end application development, cloud and on-premise, across the middle layer and the DB layer.
- Should have worked on cloud-based applications built on microservices paradigms that achieve horizontal scaling.
- Good hands-on experience with and understanding of RMQ, Elasticsearch, SQL and NoSQL databases (e.g. MongoDB), Kubernetes (K8s), Kafka Streams, etc.
- Good hands-on experience with Complex Event Processing (CEP) systems (see the sketch after this list).
- Good to have: experience solving scale and performance issues in cloud applications.
- Experience debugging applications running on Unix-like systems (e.g. Ubuntu, CentOS).
- Experience developing RESTful APIs for complex data sets.
- Knowledge of container-based development and deployment (e.g. Docker, rkt).
- Exposure to AWS, Google Cloud Platform, Microsoft Azure, etc.
- Expertise in the software security domain is a plus.
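
Several of the requirements above (Kafka Streams, CEP, horizontal scaling) share one pattern: stateful processing over event streams. As a hedged illustration, here is a minimal Kafka Streams sketch of a CEP-style rule, assuming Kafka Streams 3.x; the application id, the topic names login-failures and security-alerts, and the five-failure threshold are hypothetical, chosen only to show the windowed-aggregation shape.

    import java.time.Duration;
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.TimeWindows;

    public class LoginAlertTopology {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "login-alerts"); // hypothetical app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Events keyed by user id; "login-failures" is a hypothetical topic name.
            KStream<String, String> failures = builder.stream("login-failures");

            failures
                .groupByKey()
                // Count failures per user over a tumbling five-minute window.
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
                .count()
                .toStream()
                // Emit an alert record only when the count crosses the threshold.
                .filter((windowedUser, count) -> count >= 5)
                .map((windowedUser, count) ->
                        KeyValue.pair(windowedUser.key(), "ALERT: " + count + " failed logins"))
                .to("security-alerts");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

A real CEP deployment would also handle out-of-order events via grace periods and persist state for fault tolerance; those concerns are omitted here.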
Posted in: Backend Development
Functional Area: Big Data / Data Warehousing / ETL
Job Code: 1580094