hirist

ETL Developer

CSN Global IT Solutions
Multiple Locations
5 - 8 Years

Posted on: 31/07/2025

Job Description

Role : Amazon ETL Developer / AWS Cloud ETL Developer (L5)

Location : PAN India

Job Summary :

We are looking for an Amazon ETL Developer with 5+ years of experience, responsible for designing, developing, and maintaining ETL (Extract, Transform, Load) processes that manage data pipelines on Amazon Web Services (AWS).

Key Responsibilities and Technical Skills :

Technical Skills :

- ETL Tools : Talend (Nice to have)

- Database : Snowflake, Oracle, Amazon RDS (Aurora, PostgreSQL), DB2, SQL Server, and Cassandra

- Big Data and Amazon Services : Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, Apache Spark

- Data Modeling Tools : ArchiMate (secondary/preferred, not mandatory), Erwin, Oracle Data Modeler (secondary/preferred)

- Scheduling Tools : Autosys, SFTP, Airflow (preferred; not a blocker, as it can be learned on the job)

Key Responsibilities :

- Designing, building, and automating ETL processes using AWS services such as Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark.

- Developing and maintaining data pipelines to move and transform data from diverse sources into data warehouses or data lakes.

- Ensuring data quality and integrity through validation, cleansing, and monitoring ETL processes.

- Optimizing ETL workflows for performance, scalability, and cost efficiency within the AWS environment.

- Troubleshooting and resolving issues related to data processing and ETL workflows.

- Implementing and maintaining security measures and compliance standards for data pipelines and infrastructure.

- Documenting ETL processes, data mappings, and system architecture.

- Implementing security measures such as IAM roles and access controls.

- Diagnosing and resolving issues related to AWS services, infrastructure, and applications.

- Proficiency in big data tools and AWS services : Including Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark, as relevant to data storage and processing.

- Strong SQL skills : For querying databases and manipulating data during the transformation process.

- Programming and scripting proficiency : Primarily Python, for automating tasks, developing custom transformations, and interacting with AWS services via SDKs and APIs.

- Data warehousing and modeling expertise : Understanding data warehousing concepts, dimensional modeling, and schema design to optimize data storage and retrieval.

- Good to have : Experience with ETL tools and technologies such as Talend.

- Data quality management skills : Ensuring data accuracy, completeness, and consistency throughout the ETL process.

- Familiarity with DevOps practices : Including CI/CD pipelines and infrastructure as code.

- Experience in the Insurance domain.
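To illustrate the kind of work the responsibilities above describe (extract, transform with SQL, validate data quality, load), here is a minimal, self-contained sketch in Python using the standard-library sqlite3 module. The table names, columns, and validation rules are hypothetical examples, not part of this role's actual stack, which would use AWS services and warehouse databases instead of an in-memory SQLite database:

```python
import sqlite3

def run_etl(source_rows):
    """Minimal ETL sketch: stage raw rows, cleanse them with SQL,
    and load only validated rows into a target table."""
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()

    # Extract: stage raw source rows as-is (ids and amounts arrive as text).
    cur.execute("CREATE TABLE staging (customer_id TEXT, amount TEXT)")
    cur.executemany("INSERT INTO staging VALUES (?, ?)", source_rows)

    # Transform + data quality: trim ids, cast amounts to numbers,
    # and drop rows with blank ids or non-numeric amounts.
    cur.execute("CREATE TABLE target (customer_id TEXT NOT NULL, amount REAL NOT NULL)")
    cur.execute(
        """
        INSERT INTO target
        SELECT TRIM(customer_id), CAST(amount AS REAL)
        FROM staging
        WHERE TRIM(customer_id) <> ''
          AND amount GLOB '[0-9]*'
        """
    )
    conn.commit()

    # Load result: return the cleansed rows that reached the target table.
    loaded = cur.execute(
        "SELECT customer_id, amount FROM target ORDER BY customer_id"
    ).fetchall()
    conn.close()
    return loaded

# Example: two of the four raw rows fail validation and are dropped.
rows = [(" c1 ", "100"), ("", "50"), ("c2", "oops"), ("c3", "12.5")]
print(run_etl(rows))  # [('c1', 100.0), ('c3', 12.5)]
```

In a production pipeline for this role, the same staging/validate/load pattern would typically run on Spark over S3 data via EMR, with Airflow or Autosys handling the scheduling.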

