Posted on: 22/04/2026
Company Overview:
StatusNeo Technology Consulting Pvt. Ltd. is a leading technology consulting firm specializing in data engineering and cloud solutions.
We empower businesses across various sectors to unlock the value of their data through cutting-edge technologies and innovative strategies.
Our expertise lies in building robust data pipelines, implementing advanced analytics solutions, and providing strategic guidance to drive data-driven decision-making.
Role Overview:
As a Data Engineer at StatusNeo, you will be instrumental in designing, developing, and maintaining scalable data pipelines and infrastructure for our clients.
You will collaborate closely with data scientists, analysts, and other engineers to understand data requirements, build efficient ETL processes, and ensure data quality.
Your work will directly impact our clients' ability to gain actionable insights from their data, enabling them to improve business outcomes and stay ahead of the competition.
Key Responsibilities:
- Design and implement robust and scalable ETL processes to ingest, transform, and load data from various sources into data lakes and data warehouses.
- Develop and maintain data pipelines using Python and PySpark for efficient data processing and transformation.
- Build and optimize database schemas for Snowflake DB to ensure data integrity and performance.
- Implement Medallion Architecture principles to organize and manage data within the data lake.
- Develop and deploy serverless functions using AWS Lambda for real-time data processing and integration.
- Collaborate with data scientists and analysts to understand data requirements and provide data solutions that meet their needs.
- Monitor and troubleshoot data pipelines to ensure data quality and availability.
- Contribute to the development of data engineering best practices and standards.
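To illustrate the Medallion Architecture mentioned in the responsibilities above, here is a minimal, purely illustrative Python sketch of the bronze → silver → gold flow: raw (bronze) records are validated into a clean (silver) layer, then aggregated into a reporting-ready (gold) layer. All field names and functions here are hypothetical, and a production pipeline at this scale would use PySpark DataFrames rather than plain Python lists.

```python
# Hypothetical Medallion-style flow: bronze (raw) -> silver (clean) -> gold (aggregated).
# Field names ("customer_id", "amount") are illustrative assumptions only.

from collections import defaultdict

def to_silver(bronze_rows):
    """Clean and validate raw rows: drop records missing required fields
    and normalise the amount to a float."""
    silver = []
    for row in bronze_rows:
        if row.get("customer_id") and row.get("amount") is not None:
            silver.append({
                "customer_id": row["customer_id"],
                "amount": float(row["amount"]),
            })
    return silver

def to_gold(silver_rows):
    """Aggregate validated rows into per-customer totals."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["customer_id"]] += row["amount"]
    return dict(totals)

bronze = [
    {"customer_id": "c1", "amount": "10.5"},
    {"customer_id": "c1", "amount": "4.5"},
    {"customer_id": None, "amount": "99"},   # rejected at the silver layer
    {"customer_id": "c2", "amount": "7.0"},
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'c1': 15.0, 'c2': 7.0}
```

The same layering applies whether the layers live in a data lake (e.g. Parquet files) or in Snowflake tables: each layer is a progressively cleaner, more query-ready view of the one beneath it.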
Required Skillset:
- Demonstrated ability to design and implement ETL processes and data pipelines using Python and PySpark.
- Proven expertise in working with data lakes and data warehouses, including Snowflake DB.
- Strong understanding of database schema design and optimization.
- Experience with cloud platforms such as Azure Databricks and AWS Lambda.
- Ability to apply Medallion Architecture principles to data management.
- Excellent SQL skills for data querying and manipulation.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to adapt to new technologies and learn quickly.
- Bachelor's degree in Computer Science, Engineering, or a related field.
Posted by
Mohammed Rawoof
Sr. Talent Analyst at StatusNeo Technology Consulting Pvt. Ltd.
Last Active: 24 Apr 2026
Posted in
Data Engineering
Functional Area
Big Data / Data Warehousing / ETL
Job Code
1630551