Posted on: 08/08/2025
Job Description:
Key Responsibilities:
- ETL/ELT Pipeline Development: Architect, build, and optimize complex ETL/ELT processes to ingest, transform, and load data from diverse sources (a minimal sketch follows this list).
- Cloud Platform Expertise: Leverage extensive experience with cloud platforms, particularly AWS, to build and manage data infrastructure and services.
- Data Modeling: Develop and implement robust data modeling strategies to ensure data integrity and efficient querying.
- Big Data Technologies: Utilize big data technologies and distributed computing frameworks to process and analyze large volumes of data.
- API Integration: Work with various APIs to integrate external data sources into our data ecosystem.
- Performance Optimization: Tune database queries and optimize data pipelines to improve performance and efficiency.
- Cross-functional Collaboration: Work closely with data scientists, analysts, and software engineers to understand data requirements and deliver impactful solutions.
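To give a concrete flavor of the pipeline work above, here is a minimal ETL sketch in Python (the language named in the requirements): it extracts JSON from a source API, filters and reshapes the records, and loads the result to S3 via boto3. The API URL, bucket name, and field names are illustrative assumptions, not part of this role's actual systems.

```python
# A minimal sketch of an ETL job in the spirit of the responsibilities above.
# The source URL, bucket, and record fields are hypothetical examples.
import csv
import io
import json
from urllib.request import urlopen

import boto3  # AWS SDK for Python

EXAMPLE_API_URL = "https://api.example.com/orders"  # hypothetical source API
BUCKET = "example-data-lake"                        # hypothetical S3 target

def extract(url: str) -> list[dict]:
    """Pull raw JSON records from the source API."""
    with urlopen(url) as resp:
        return json.load(resp)

def transform(records: list[dict]) -> list[dict]:
    """Keep only completed orders and normalize field names."""
    return [
        {"order_id": r["id"], "amount_usd": float(r["amount"])}
        for r in records
        if r.get("status") == "completed"
    ]

def load(rows: list[dict], bucket: str, key: str) -> None:
    """Serialize the transformed rows as CSV and write them to S3."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["order_id", "amount_usd"])
    writer.writeheader()
    writer.writerows(rows)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())

if __name__ == "__main__":
    load(transform(extract(EXAMPLE_API_URL)), BUCKET, "orders/clean/orders.csv")
```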
Required Skills & Qualifications:
- Strong proficiency in Python and SQL.
- Deep expertise in data warehousing concepts and implementation.
- Hands-on experience in building and optimizing ETL processes.
- Proven experience with cloud platforms, with a strong focus on AWS.
- Experience with big data technologies (e.g., Apache Spark, Hadoop); a short Spark sketch follows this list.
- Solid understanding of data modeling principles.
- Experience working with APIs for data integration.
- Excellent communication skills for effective collaboration and stakeholder interaction.
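As an illustration of the big data skills referenced above, the following PySpark sketch computes daily aggregates over partitioned Parquet data. The S3 paths, column names, and date filter are hypothetical, chosen only to show idiomatic Spark usage.

```python
# A minimal PySpark sketch of distributed aggregation with basic tuning.
# Input/output paths and columns are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-aggregates").getOrCreate()

# Read partitioned Parquet; the partition filter below lets Spark prune
# partitions at read time instead of scanning the full dataset.
events = spark.read.parquet("s3a://example-data-lake/events/")

daily = (
    events
    .filter(F.col("event_date") >= "2025-01-01")  # partition pruning
    .groupBy("event_date", "event_type")
    .agg(
        F.count("*").alias("events"),
        F.countDistinct("user_id").alias("users"),
    )
)

# Coalesce before writing to avoid producing many small output files,
# a common pipeline optimization.
daily.coalesce(8).write.mode("overwrite").parquet(
    "s3a://example-data-lake/daily/"
)
```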
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1526874