Posted on: 28/04/2026
Job Description:
Key Responsibilities:
- Architect and implement complex, multi-step workflows for large-scale data processing
- Design and develop serverless applications using AWS Lambda
- Build and manage orchestration workflows using AWS Step Functions
- Extract, transform, and load (ETL) data from Snowflake and other data sources
- Optimize SQL queries and improve performance within Snowflake environments
- Design and maintain scalable data pipelines ensuring reliability and efficiency
- Work with AWS services such as DynamoDB, S3, IAM, and CloudWatch
- Implement infrastructure as code (IaC) using Terraform
- Tune system performance and ensure high availability
- Collaborate with cross-functional data, backend, and DevOps teams
Required Skills & Qualifications:
Experience:
- 5+ years in Python development and data engineering
- Strong proficiency in Python programming
- Advanced expertise in SQL, including query optimization and performance tuning (Snowflake preferred)
- Hands-on experience with AWS services: Lambda, Step Functions, DynamoDB, S3, IAM, CloudWatch
- Experience with Terraform for infrastructure provisioning
- Strong understanding of ETL pipeline design and implementation
- Familiarity with serverless architecture and distributed systems
- Good problem-solving and analytical skills
Preferred Qualifications:
- Experience working with large-scale data systems
- Exposure to full-stack development (frontend/backend integration)
- Knowledge of CI/CD pipelines and DevOps practices
- Experience in Agile/Scrum environments
Education: Bachelor's degree in Computer Science, Engineering, or a related field (B.Tech / B.E. / B.Sc / BCA / BA in a relevant specialization)
Posted by: Ankita Rastogi, Senior Manager - HR at Vichara Technology (India) Private Limited
Posted in: Data Analytics & BI
Functional Area: Data Mining / Analysis
Job Code: 1631851