Posted on: 18/08/2025
Job Title: DBT Developer - Capital Market
Location: Bangalore / Pune
Work Mode: Hybrid (3 days WFO - Tuesday, Wednesday, Thursday)
Shift Timing: 12:30 PM - 9:30 PM
Role Overview
We are seeking a skilled DBT Developer with expertise in data transformation and modeling to join our cross-functional Agile team. The ideal candidate will design and maintain scalable data transformation pipelines using dbt (Data Build Tool) within a Snowflake environment. This role offers the opportunity to work at the intersection of engineering, analytics, and business, helping to deliver clean, reliable, and analytics-ready data models that power applications, reporting, and strategic insights.
Key Responsibilities :
- Design, build, and maintain scalable, modular dbt models and transformation pipelines.
- Write efficient SQL queries to transform raw data into curated datasets in Snowflake.
- Collaborate with backend/frontend engineers, product managers, and analysts to develop analytics-ready data models that support business and application needs.
- Partner with stakeholders to gather data requirements and translate them into reliable, maintainable solutions.
- Enforce data quality through testing, documentation, and version control in dbt.
- Integrate dbt workflows into CI/CD pipelines and support automated deployments.
- Monitor and optimize data pipelines for performance, reliability, and scalability.
- Actively participate in Agile ceremonies (stand-ups, sprint planning, retrospectives) and manage tasks using Jira.
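For context on the first few responsibilities, a modular dbt staging model and its declared tests might look like the following sketch (the `raw.trades` source, column names, and file paths are hypothetical, not part of this role description):

```sql
-- models/staging/stg_trades.sql
-- Staging model: light cleanup over a hypothetical raw capital-markets source.
select
    trade_id,
    upper(symbol)        as symbol,
    cast(qty as integer) as quantity,
    executed_at
from {{ source('raw', 'trades') }}
```

```yaml
# models/staging/schema.yml
# Declared tests like these are how dbt enforces data quality in CI.
version: 2
models:
  - name: stg_trades
    columns:
      - name: trade_id
        tests:
          - unique
          - not_null
```

Running `dbt build` compiles the model against Snowflake and executes the declared tests, which is typically the step wired into a CI/CD pipeline.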
Required Qualifications & Skills :
- 3-5 years of experience in data engineering / analytics engineering with a strong focus on SQL-based transformations.
- Hands-on, production-level experience using dbt as a primary development tool.
- Strong proficiency in SQL and knowledge of data modeling principles (star/snowflake schema).
- Proven experience with Snowflake or other cloud-based data warehouses.
- Familiarity with Git-based workflows for version control.
- Strong communication and collaboration skills to work effectively across engineering and business teams.
- Prior experience working in Agile/Scrum environments with Jira.
Nice-to-Have Skills :
- Experience integrating dbt into CI/CD pipelines.
- Exposure to cloud platforms (AWS preferred).
- Familiarity with Docker and container-based development.
- Knowledge of data orchestration tools (Airflow, Prefect, Dagster, etc.).
- Understanding of downstream BI/Analytics tools (Looker, Tableau, Power BI).
- Basic Python scripting for data pipeline integration/ingestion.
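"Basic Python scripting" in this context usually means small glue utilities around ingestion. A minimal sketch, assuming a CSV feed (the field names below are illustrative only):

```python
import csv
import io

def rows_from_csv(text: str) -> list[dict]:
    """Parse raw CSV text into a list of dicts ready for staging/loading."""
    return list(csv.DictReader(io.StringIO(text)))

# Hypothetical raw trade feed as it might arrive before being staged in Snowflake.
raw = "trade_id,symbol,qty\n1,AAPL,100\n2,MSFT,50\n"
rows = rows_from_csv(raw)
```
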
Preferred Experience :
- Proven track record of building and managing production-scale dbt projects.
- Experience working with cross-functional teams involving developers, analysts, and product managers.
- Strong sense of ownership, documentation, and data quality assurance.
Functional Area: Data Engineering
Job Code: 1531177