hirist

Senior AWS Data Engineer

NetConnectGlobal
4 - 7 Years
Mumbai

Posted on: 30/03/2026

Job Description


Role: Senior AWS Data Engineer

Location: Vikhroli, Mumbai (the candidate should be based in Mumbai, along the Mira Road to Dadar, Ulhasnagar to Dadar, or Sion to Khaleshwar corridors)

Work mode: 7 days/month work from office (WFO)

Experience: 4+ years

Notice period: Immediate to 30 days

Schedule: 7 days a month, 10 AM to 7 PM UK time

Role Overview:

As a Senior Data Engineer, you will be instrumental in designing, developing, and maintaining our clients' data pipelines and data lake solutions on AWS. You will collaborate closely with data scientists, analysts, and other engineers to ensure the efficient and reliable flow of data from various sources into the data lake. Your expertise will be crucial in optimizing data processing, ensuring data quality, and enabling advanced analytics and reporting. This role directly impacts our clients' ability to leverage their data for strategic decision-making and improved business outcomes.

Key Responsibilities:

- Design and implement scalable and reliable data pipelines using AWS Glue, Python, and SQL to ingest, transform, and load data into S3-based data lake zones.

- Develop and maintain data models and schemas optimized for querying and analysis using AWS Athena and Redshift.

- Build and deploy automated data quality checks and monitoring systems to ensure data accuracy and completeness.

- Collaborate with data scientists and analysts to understand their data requirements and provide them with the necessary data infrastructure and tools.

- Implement CI/CD pipelines using Git and other automation tools to ensure the smooth and efficient deployment of data engineering solutions.

- Triage and resolve data-related issues, ensuring minimal disruption to data availability and data-driven processes.

- Develop and maintain AWS Lambda functions for data processing and automation tasks.

- Create and maintain comprehensive documentation for data pipelines, data models, and data engineering processes.

- Design and implement automation testing frameworks to ensure the quality and reliability of data pipelines.

Required Skillset:

- Demonstrated ability to design and implement data pipelines and data lake solutions using AWS services such as S3, Glue, Athena, Redshift, and Lambda.

- Strong proficiency in Python and SQL for data manipulation, transformation, and analysis.

- Proven ability to implement CI/CD pipelines using Git and other automation tools.

- Excellent problem-solving and troubleshooting skills, with the ability to quickly identify and resolve data-related issues.

- Strong communication and collaboration skills, with the ability to effectively communicate technical concepts to both technical and non-technical audiences.

- Bachelor's degree in Computer Science, Engineering, or a related field.

- Ability to work effectively in a fast-paced, agile environment.
