hirist

Senior Data Engineer - Python/SQL/ETL

Scoop Technologies Pvt Ltd
Hyderabad
5 - 7 Years
Rating : 4.4 (24+ Reviews)

Posted on: 12/09/2025

Job Description

Job Title : Senior Data Engineer

Experience : 5 to 7 Years

Location : Hyderabad

Key Responsibilities :

- Design, develop, and maintain robust data pipelines using AWS Glue, PySpark, and Python.

- Build scalable and efficient ETL/ELT workflows and manage metadata using Glue Data Catalog.

- Optimize data models, partitioning strategies, and query performance for large-scale data lakes.

- Implement and maintain cloud-native data solutions using AWS services like S3, Lambda, Aurora PostgreSQL, RDS, IAM, and EC2.

- Develop infrastructure as code using Terraform for reproducible, scalable infrastructure.

- Collaborate with cross-functional teams in an agile environment to deliver high-quality data products.

- Work with modern data lake frameworks including AWS Data Lake, Apache Iceberg, and Lake Formation.

- Ensure data governance, security, and performance tuning across data systems.

- Support CI/CD and DevOps best practices for deploying data solutions.

Required Skills & Qualifications :

- Bachelor's degree in Computer Science, Engineering, or a related field.

- 5 to 7 years of experience in Data Engineering or related roles.

- Deep expertise in AWS cloud ecosystem, especially Glue, S3, Lambda, EC2, Aurora PostgreSQL, RDS, and IAM.

- Proficient in Python and PySpark for large-scale data processing.

- Strong understanding of SQL, data modelling, and performance optimization techniques.

- Hands-on experience with Terraform and infrastructure automation.

- Experience with AWS Data Lake, Lake Formation, and Apache Iceberg.

- Good understanding of DevOps principles and experience with CI/CD pipelines.

- Excellent problem-solving skills and ability to work in fast-paced, agile environments.

- Strong verbal and written communication skills.

Good to Have :

- AWS certification (e.g., AWS Certified Data Analytics, Solutions Architect).

- Experience with data quality frameworks or monitoring tools.

- Exposure to streaming data tools like Kinesis or Kafka.
