Posted on: 30/09/2025
Roles & Responsibilities:
- Build and optimize data workflows on the AWS cloud platform, ensuring scalability and performance (a sketch of such a workflow follows this list).
- Develop and manage data integration between Snowflake and various data sources.
- Write and optimize complex SQL queries for data extraction, transformation, and analysis.
- Ensure data quality, integrity, and governance across pipelines and storage systems.
- Collaborate with data analysts, data scientists, and business teams to deliver reliable datasets.
- Monitor, troubleshoot, and improve performance of ETL jobs and data pipelines.
- Implement best practices for cost optimization, security, and compliance in AWS data solutions.
- Contribute to architecture discussions and recommend improvements for data platforms.
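To give a concrete feel for the workflows described above, here is a minimal PySpark sketch of a Glue-style ETL job: extract from S3, apply a simple data-quality gate, and load partitioned Parquet for downstream consumers. All bucket names, paths, and column names are hypothetical, not part of this posting.

```python
# Minimal ETL sketch of the kind of workflow this role builds.
# Bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw order events from S3 (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/orders/")

# Transform: deduplicate and enforce a basic data-quality rule.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_total").isNotNull())
       .withColumn("order_date", F.to_date("order_timestamp"))
)

# Load: write partitioned Parquet for Athena/Redshift consumers.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://example-curated-bucket/orders/"))
```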
Requirements:
- Strong hands-on expertise in AWS Glue, PySpark, and ETL development.
- Advanced SQL programming skills, with the ability to write complex queries (see the window-function sketch after this list).
- Strong knowledge of AWS services (S3, Lambda, Athena, Redshift, IAM, etc.).
- Experience in data modeling, schema design, and data pipeline orchestration.
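As an illustration of the "advanced SQL" expectation, the sketch below runs a window-function query through Spark SQL, ranking each customer's orders by value. The table name, columns, and S3 path are illustrative assumptions, not taken from the posting.

```python
# Hypothetical example of the window-function SQL this role calls for,
# executed via Spark SQL; the view, columns, and path are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-sketch").getOrCreate()

# Register the curated orders data (hypothetical path) as a temp view.
spark.read.parquet("s3://example-curated-bucket/orders/") \
     .createOrReplaceTempView("orders")

# Rank each customer's orders by value; keep the top three per customer.
top_orders = spark.sql("""
    SELECT *
    FROM (
        SELECT customer_id,
               order_id,
               order_total,
               RANK() OVER (PARTITION BY customer_id
                            ORDER BY order_total DESC) AS total_rank
        FROM orders
    ) ranked
    WHERE total_rank <= 3
""")
top_orders.show()
```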
Posted By: Naveen, HR Associate at DECISION POINT PRIVATE LIMITED
Last Active: N/A (the recruiter posted this job through a third-party tool)
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1554547