hirist

Job Description

Job Title : ETL Developer (AWS, Snowflake, IBM DataStage)

Experience : 5 to 7 Years

Location : India (Remote - Work from Home)

Education : B.Tech in Computer Science or related field

Employment Type : Full-Time

Job Summary :

We are seeking a skilled ETL Developer with 5-7 years of hands-on experience in building and managing ETL pipelines using AWS, Snowflake, and IBM DataStage. Experience with IBM Cloud Pak for Data (CP4D) is a plus. The ideal candidate has a solid background in data integration, transformation, and cloud data platforms, with a focus on performance, scalability, and security.

Key Responsibilities :

- Design, develop, and maintain robust ETL pipelines using IBM DataStage, AWS Glue, and Snowflake.

- Build scalable data integration solutions to support enterprise-level data warehousing and analytics initiatives.

- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver effective solutions.

- Optimize and monitor ETL workflows for performance and reliability.

- Develop and maintain data mappings, transformation rules, and data quality checks.

- Ensure secure and efficient data movement across cloud and on-premises environments.

- Document data processes, pipeline structures, and design patterns for future reference.

- Work with version control tools and CI/CD processes for code deployment.

- Engage in troubleshooting and debugging of ETL jobs and data pipeline issues.

Required Skills :

- 5 - 7 years of professional experience in ETL/Data Engineering roles.

- Strong experience with IBM DataStage development and deployment.

- Hands-on experience with AWS services (S3, Glue, Lambda, Redshift, etc.).

- Proficient in Snowflake development, performance tuning, and data loading.

- Solid understanding of SQL, stored procedures, and data warehousing concepts.

- Experience with scheduling and orchestration tools (e.g., Control-M, Apache Airflow).

- Familiarity with Git or other version control systems.

- Strong analytical and problem-solving skills.

Good to Have :

- Experience with IBM Cloud Pak for Data (CP4D) platform.

- Knowledge of DevOps practices and CI/CD pipelines for data projects.

- Understanding of data governance, security, and compliance practices.

Benefits :

- Flexible remote work environment.

- Opportunity to work with cutting-edge data technologies.

- Collaborative and growth-oriented team culture.

