- We are seeking an experienced Data Lead to own and drive the design, development, and optimization of a modern data platform built on Snowflake and DBT.
- The ideal candidate has hands-on expertise with Snowflake, DBT, and PySpark, combined with strong data architecture, team leadership, and stakeholder management skills.
Key Responsibilities:
- Data Architecture & Strategy: Drive the design and implementation of the enterprise data platform, leveraging Snowflake as the core data infrastructure.
- Pipeline Development: Architect and deliver robust ETL/ELT pipelines in Snowflake and DBT to integrate data from diverse source systems.
- Full Development Lifecycle: Participate in all phases of solution delivery, including development, testing, deployment, and optimization.
- Quality Assurance: Perform code reviews, ensure adherence to engineering best practices, and continuously improve code quality and ETL performance.
- Issue Resolution: Diagnose, troubleshoot, and resolve complex ETL/ELT issues with efficiency and precision.
- Cross-Functional Collaboration: Work closely with product, engineering, analytics, and business teams to solve complex data challenges and align solutions with organizational goals.
Requirements:
Required Skills and Qualifications:
- At least 6 years of professional experience, including hands-on work as a Data Lead with strong expertise in Snowflake and DBT.
- Bachelor's degree in Computer Science, Software Engineering, or a related discipline.
- Solid working knowledge of SQL and PySpark, as well as Python data libraries such as NumPy, SciPy, Pandas, Dask, and SQLAlchemy.
- Strong proficiency in Snowflake data warehousing, including schema design and implementation.
Benefits:
- Competitive salary and performance-based bonuses.