Posted on: 15/12/2025
Data Engineer (AWS, ETL, GCP)
Experience: 5+ years
Job location: Pune / Indore / Kolkata / Bangalore
Notice period: Immediate joiner
Description:
- Strong experience with an ETL tool such as Talend, SSIS, or Informatica
- Google Cloud Platform (Cloud Storage buckets, Cloud Data Fusion, BigQuery, Google Analytics)
- Ability to lead projects individually and deliver them on time
- Strong experience in performance-tuning techniques
- Experience with real-time streaming implementation and architecture is a bonus
- Experience building reports and data visualizations with BI tools such as Tableau or Power BI
- Strong foundation in data organization and experience with ETL/ELT processes
- Strong experience designing resilient data pipelines
- Ability to own problems end to end in order to collect, extract, and clean data effectively
- Help implement a maintenance strategy for all datasets
- Strong experience with AWS and GCP technologies (S3, SQS, Redshift, Kinesis, Cloud Storage buckets, Cloud Data Fusion, BigQuery)
- Working knowledge of Java/Python and their respective build and packaging systems
- Experience with NoSQL databases and big data query engines such as MongoDB, DynamoDB, Druid, Hive, and Presto
- Amazon Web Services (S3, SQS, Redshift, DocumentDB, etc.)
- Experience with Python
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1590279