
Job Description

Job Title : AWS Data Engineer

Location : Hyderabad (Hybrid)

Experience Required : 6 to 7 Years

Employment Type : Permanent


Job Description :


We are looking for a highly skilled and experienced AWS Data Engineer to join our growing data engineering team. The ideal candidate will have a strong background in building scalable data pipelines and data integration, along with hands-on experience with AWS cloud services. You will play a critical role in designing, developing, and optimizing data solutions that support analytics and business intelligence across the organization.


Key Responsibilities :


- Design, develop, and maintain scalable and robust data pipelines using AWS cloud-native tools.

- Build and manage ETL/ELT processes to ingest and transform structured and unstructured data.

- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver reliable datasets.

- Implement best practices for data engineering, including data quality, governance, testing, and security.

- Monitor data workflows and troubleshoot issues across the pipeline lifecycle.

- Work on performance tuning, scalability, and cost optimization of data processes on AWS.

- Create and maintain technical documentation related to data pipelines and infrastructure.

- Contribute to continuous integration and continuous deployment (CI/CD) automation for data solutions.


Required Skills :


- 6 to 7 years of overall experience in data engineering.

- Strong expertise in AWS data services such as AWS Glue, Redshift, S3, Lambda, Step Functions, and Athena.

- Proficiency in Python or Scala for ETL development and data transformation.

- Solid experience with SQL for data manipulation and querying.

- Experience with data lake and data warehouse architecture.

- Good understanding of data modeling concepts and performance tuning.

- Hands-on experience with version control tools such as Git, and with CI/CD pipelines.

- Familiarity with Airflow, DBT, or similar workflow orchestration frameworks is a plus.

- Excellent problem-solving, analytical, and communication skills.


Nice to Have :


- Experience with big data technologies like Spark, Kafka, or Hadoop.

- Exposure to DevOps practices and infrastructure-as-code tools like Terraform or CloudFormation.

- Knowledge of data security, GDPR, and compliance standards.


Educational Qualification :


- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.


Why Join Us?


- Work with a dynamic and collaborative team focused on cutting-edge data solutions.

- Opportunity to contribute to high-impact projects in a cloud-first environment.

- Flexible hybrid working model with a long-term career growth path.

