hirist

Data Engineer - AWS Platform

Dasro Consulting Pvt Ltd
5 - 8 Years
Multiple Locations

Posted on: 08/04/2026

Job Description

Key Requirements:

Experience: 5+ years in Data Engineering

Shift Timings: 3 PM to 12 midnight IST

Location: Preference for local candidates who can attend Tech Round 2 in person at Noida, Gurugram, or Chandigarh

Work Model: Must be open to working from the office (WFO) three days a week

Primary Skills: Strong experience in Data Engineering and AWS

Job Summary:

We are looking for a highly skilled and motivated Data Engineer with strong expertise in AWS data services to join our data platform team. The ideal candidate will have hands-on experience designing scalable data pipelines, workflow orchestration frameworks, and large-scale data migration solutions.

In this role, you will build robust cloud-native data engineering solutions on AWS, migrate datasets from legacy systems and data warehouses, and ensure secure, efficient data processing pipelines across distributed environments.

Key Responsibilities:

AWS Data Pipeline Development:

- Design and implement scalable ETL/ELT data pipelines using AWS Glue, AWS Lambda, and AWS S3.

- Build and maintain high-performance data ingestion frameworks for processing large-scale datasets.

- Implement data pipelines for data warehousing and analytics platforms such as AWS Redshift.

- Optimize storage and querying strategies using AWS S3 data lakes.

Data Workflow Orchestration:

- Develop and maintain data workflow orchestration frameworks using tools such as Apache Airflow or AWS Step Functions.

- Automate complex workflows including data ingestion, transformation, validation, and loading processes.

- Build reusable and configurable workflows to support multiple data processing use cases.

Data Migration & Integration:

- Lead data migrations from legacy data warehouse technologies to modern AWS data platforms.

- Perform data migration from RDBMS systems (e.g., MySQL, SQL Server, Oracle) to AWS S3 or AWS Redshift.

- Design scalable migration frameworks for large datasets with minimal downtime.

- Integrate data sources from enterprise applications and external systems.

Data Security & Governance:

- Implement secure data pipelines using AWS security best practices.

- Manage access control and data governance using AWS IAM and Lake Formation.

- Ensure data encryption, access management, and compliance across all data platforms.

Performance Optimization & Monitoring:

- Monitor data pipelines and troubleshoot performance issues.

- Optimize ETL workflows for scalability, reliability, and cost efficiency.

- Implement logging, monitoring, and alerting mechanisms for data pipelines.

Required Skills & Qualifications (Must Have):

- 5+ years of experience in Data Engineering or Data Platform development

- Strong hands-on experience with:

1. AWS

2. AWS Glue

3. AWS S3

4. AWS Lambda

- Experience with Data Workflow Orchestration tools such as Apache Airflow or AWS Step Functions

- Experience performing data migrations from other data warehouse technologies

- Experience performing data migrations from RDBMS systems to AWS S3 or AWS Redshift

- Strong expertise in Python and SQL for building scalable data pipelines

- Solid understanding of ETL/ELT concepts, data partitioning, and distributed data processing

- Experience working with version control systems such as GitLab or Bitbucket

- Strong debugging, analytical thinking, and problem-solving skills

- Basic understanding of Object-Oriented Programming concepts

Industry Knowledge & Experience:

- Experience building cloud-native data engineering solutions on AWS

- Experience with data warehouse architectures and large-scale analytics platforms

- Hands-on experience with data extraction, transformation, and migration frameworks

- Experience working in high-volume data environments such as FinTech, analytics platforms, or enterprise data systems

Good-to-Have Skills:

- IBM Cognos

- AWS Athena

- AWS Lake Formation

- AWS Redshift

- AWS Glue Data Catalog

- AWS SageMaker

- AWS IAM

Soft Skills:

- Strong communication skills to present technical solutions and recommendations to stakeholders

- Ability to work cross-functionally in a fast-paced and evolving environment

- Detail-oriented with a proactive approach to identifying and solving data platform challenges

- Ability to collaborate effectively with data scientists, analysts, and platform engineering teams
