Posted on: 06/08/2025
Job Description:
Key Responsibilities:
- Design and implement scalable data pipelines using ETL/ELT frameworks.
- Develop and maintain data models and data warehouse architecture using Snowflake.
- Build and manage DBT (Data Build Tool) models for data transformation and lineage tracking.
- Write efficient and reusable Python scripts for data ingestion, transformation, and automation.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and governance across all data platforms.
- Monitor and optimize performance of data pipelines and queries.
- Implement best practices for data engineering, including version control, testing, and CI/CD.
Required Skills and Qualifications:
- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.).
- Experience with DBT for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and their data services.
Posted By
Kurakula Srilatha
Human Resources Associate at KloudPortal Technology Solutions Pvt Ltd
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1525585