Posted on: 29/04/2026
Description :
Duties & Responsibilities :
- Lead the design, development, and optimization of ETL/ELT pipelines and workflows
- Lead and contribute to API design and development, including RESTful services used by both first-party and third-party consumers
- Design, develop, and maintain Java Spring Boot-based microservices with a strong focus on scalability, reliability, and performance
- Write, tune, and optimize advanced SQL queries for large-scale data processing
- Build and maintain data warehouses and data models (relational, dimensional, star schema)
- Develop full-stack applications, including new capabilities and features, for both on-premises and cloud-based environments
- Utilize modern programming languages to develop scalable, secure, and high-performance applications that align with business needs
- Translate business requirements into robust, well-architected, and reusable data solutions
- Partner with analysts, data scientists, and stakeholders to deliver trusted, actionable datasets
- Implement and maintain unit tests, integration tests, and validation frameworks to ensure pipeline reliability
- Document workflows and design decisions to support knowledge sharing and operational continuity
- Apply coding standards, CI/CD practices, version control, and peer code reviews to ensure high-quality deliverables
- Proactively monitor, optimize, and troubleshoot pipelines for performance, scalability, and cost efficiency
- Support deployments and handle post-production monitoring and incident resolution
- Mentor associate engineers, providing technical guidance and feedback
Requirements :
Minimum Years of Experience :
- 5+ Yrs - Data Engineering
- 3+ Yrs - Snowflake or any cloud Data warehouse
- 2+ Yrs - RESTful APIs, Java Spring Boot, API integration
- 1+ Yrs - Python
Basic Qualifications :
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
- 4-7 years of hands-on experience in data engineering or related fields
- 3+ years of hands-on experience developing RESTful APIs and microservices using Spring Boot, as well as Node.js development, including API integration and service orchestration
- Proficiency in batch processing, job scheduling, and performance tuning with Spring Batch
- Expertise in writing complex SQL queries, using advanced functions, and tuning performance in cloud data warehouse environments
- Strong experience with data warehousing concepts, design, and implementation
- Minimum 3 years of hands-on experience with Snowflake or a modern cloud data warehouse
- Minimum 2 years of hands-on experience in data modeling
- Minimum 1 year of hands-on Python development (ETL/ELT scripting, OOP, automation)
- Strong knowledge of at least one major cloud platform (AWS, Azure, or GCP)
- Experience with orchestration tools (e.g., Airflow, ADF, Luigi) for workflow management
- Demonstrated ability to lead and mentor teams while managing multiple priorities
- Strong communication and stakeholder management skills
Preferred Qualifications :
- Hands-on experience with DBT (Data Build Tool) for data transformations
- Familiarity with DevOps and CI/CD best practices in data engineering
- Exposure to real-time/streaming platforms (Kafka, Spark Streaming, Flink)
- Exposure to the e-commerce domain or large-scale B2B/B2C environments
- Ability to break down problems and estimate time for development tasks
- Experience mentoring and guiding junior engineers
- Understands the technology landscape, stays up to date on current technology trends and new technologies, and brings new ideas to the team
- Learns the organization's vision statement and decision-making framework; understands how team and personal goals/objectives contribute to the organization's vision
Posted in : Data Engineering
Functional Area : Data Engineering
Job Code : 1632335