hirist

Vontier - Senior Data Engineer - Snowflake DB

Vontier
5 - 8 Years
Bangalore

Posted on: 29/04/2026

Job Description

About Vontier :


Vontier is a global industrial technology company focused on smarter transportation and mobility. Our portfolio of five operating companies is united by a powerful purpose: mobilizing the future to create a better world. Our pioneering solutions advance safety, security, efficiency, and environmental compliance worldwide. With more people on the planet, rapid urbanization, and increasing vehicle traffic on our roads, it's time to pave the way for the evolution of transportation. Vontier is ready to build that future!


We are a global company focused on transportation and mobility. Our portfolio includes industry-leading expertise in smart cities, mobility infrastructure, retail and commercial fueling, fleet management, and vehicle maintenance and repair. These innovative companies are leading the way to smarter transportation for a growing, connected world.


Gilbarco Veeder-Root, a Vontier company, represents the leading brands of solutions and technologies that provide convenience, control, and environmental integrity for retail fueling and adjacent markets. In 2002, the Gilbarco and Veeder-Root companies combined into one marketing brand, with distinctive and complementary business lines, services, and sales capabilities.


Gilbarco is a leading global supplier of fuel dispensing equipment and fully integrated point-of-sale systems for the global petroleum marketplace, with sales, manufacturing, research, development, and service locations in North and South America, Europe, Asia, the Pacific Rim, and Australia.


Job Description :


Job Title : Senior Data Engineer (Snowflake)


Location : Bengaluru, India (Hybrid; 3 days work-from-office mandatory)


Employment Type : Full-Time


WHAT YOU WILL DO (Job Responsibilities) :


- Design, develop, and maintain scalable data pipelines processing EV charging transaction data from multiple on-premises customer databases


- Build and optimize dbt models using dbt Core and dbt Cloud, implementing dimensional modeling patterns and incremental loading strategies across DEV, UAT, and PROD environments


- Orchestrate complex data workflows using Apache Airflow (Cloud Composer), managing dependencies and scheduling across multiple data sources


- Manage data ingestion workflows from source systems (Postgres on AWS, MySQL) to GCP and Snowflake, utilizing Snowpipe for automated CSV processing


- Develop and maintain multi-tenant data architectures in Snowflake, ensuring proper data isolation and role-based access control (RBAC)


- Configure and manage Snowflake objects including stages, storage integrations, and external tables for seamless data movement


- Design and implement incremental export processes to deliver data from Snowflake to external tenant storage systems


- Optimize Snowflake warehouse performance, manage compute resources, and implement data governance best practices at the account admin level


- Collaborate with BI teams to ensure data models support Tableau dashboards and analytics requirements


- Implement CI/CD pipelines for automated dbt deployments with proper testing and rollback capabilities


- Monitor data quality, troubleshoot pipeline issues, and maintain comprehensive documentation
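
As a rough illustration of the incremental-loading pattern mentioned above, a dbt model for charging transactions might look like the following sketch (the model, table, and column names are invented for this example, not taken from the posting):

```sql
-- models/marts/fct_charging_sessions.sql (hypothetical model name)
{{
  config(
    materialized='incremental',
    unique_key='session_id',
    incremental_strategy='merge'
  )
}}

select
    session_id,
    charger_id,
    customer_id,
    started_at,
    energy_kwh,
    loaded_at
from {{ ref('stg_charging_sessions') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than what is already loaded
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

With a `merge` strategy and a `unique_key`, repeated runs update changed rows instead of duplicating them, which is what makes the same model safe to promote across DEV, UAT, and PROD.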


WHO YOU ARE (Qualifications) :


Required :


- 4+ years of experience in data engineering roles


- Strong proficiency with dbt (Core and Cloud) including incremental models, macros, and testing frameworks


- Production experience with Snowflake data warehouse including performance optimization, role management, and security features


- Experience working as a Snowflake account admin: managing users, RBAC-based roles, warehouses, resource monitors, and account-level configurations


- Strong Python programming skills for data pipeline development, automation, and scripting


- Apache Airflow experience for workflow orchestration and scheduling


- Hands-on experience with cloud platforms: GCP and AWS (S3, RDS, Cloud Storage, Secret Manager, compute services)


- Experience configuring data extraction from PostgreSQL databases on AWS


- Expertise in defining and managing Snowflake stages and storage integrations for data loading and unloading


- Experience designing incremental export solutions to deliver data from Snowflake to external tenant environments


- Strong SQL skills and understanding of dimensional modeling (star/snowflake schemas, SCD patterns)


- Experience with version control systems (Git/GitHub/Bitbucket) and CI/CD practices


- Understanding of multi-tenant data architecture principles and data isolation strategies


- 3+ years of hands-on experience with Tableau for dashboard development and data visualization


- Familiarity with ELT/ETL design patterns and data pipeline orchestration
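
The stage and storage-integration work called for above typically looks something like the following Snowflake DDL. All integration, bucket, database, and table names here are placeholders for the sketch:

```sql
-- Hypothetical names throughout; adjust to the actual GCP project/bucket.
CREATE STORAGE INTEGRATION gcs_ev_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://ev-charging-landing/');

-- An external stage pointing at the bucket via the integration.
CREATE STAGE raw.public.ev_landing_stage
  URL = 'gcs://ev-charging-landing/transactions/'
  STORAGE_INTEGRATION = gcs_ev_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- One-off load; continuous ingestion would use a Snowpipe on the same stage.
COPY INTO raw.public.charging_transactions
  FROM @raw.public.ev_landing_stage;
```

The same stage/integration mechanism also runs in reverse (`COPY INTO @stage FROM table`) for the incremental exports to external tenant storage described earlier.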


Nice to Have :


- Experience implementing Change Data Capture (CDC) solutions into Snowflake


- Kafka experience for real-time data streaming and event-driven architectures


- Experience migrating from Jenkins/Bash to Airflow-based orchestration


- Experience in the EV charging, automotive, or energy sectors


- Understanding of data security compliance and MFA implementation


- Background in API integration and external data source management
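
For the CDC item above, one common pattern inside Snowflake combines streams and tasks (both listed in the technical environment below). A minimal sketch, with placeholder object names:

```sql
-- Hypothetical CDC sketch using a Snowflake stream plus a scheduled task.
CREATE STREAM raw.public.charging_transactions_stream
  ON TABLE raw.public.charging_transactions;

CREATE TASK raw.public.apply_charging_changes
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw.public.charging_transactions_stream')
AS
  MERGE INTO analytics.public.charging_transactions t
  USING raw.public.charging_transactions_stream s
    ON t.transaction_id = s.transaction_id
  WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
  WHEN NOT MATCHED THEN INSERT (transaction_id, amount, updated_at)
    VALUES (s.transaction_id, s.amount, s.updated_at);
```

The `WHEN SYSTEM$STREAM_HAS_DATA(...)` clause keeps the task from burning warehouse credits when no new changes have arrived.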


Technical Environment :


- Data Warehouse : Snowflake (clustering, time travel, streams, tasks, stages, storage integrations)


- Transformation : dbt Core, dbt Cloud, dbt CLI


- Orchestration : Apache Airflow, Cloud Composer (GCP)


- Cloud Platforms : Google Cloud Platform (GCP), Amazon Web Services (AWS)


- Programming : Python, SQL


- Version Control : Bitbucket/GitHub


- BI Tools : Tableau


- Operating Systems : macOS, Windows


- Source Systems : MySQL, PostgreSQL (AWS RDS)


- Data Ingestion : Snowpipe, CSV processing, external stages


- Streaming (Nice to Have) : Apache Kafka
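
Tying the ingestion pieces of this stack together, an auto-ingest Snowpipe over an external stage might be sketched as follows (pipe, stage, table, and notification-integration names are all hypothetical):

```sql
-- Hypothetical Snowpipe for continuous CSV ingestion from an external stage.
-- On GCS, auto-ingest requires a separately created notification integration.
CREATE PIPE raw.public.charging_csv_pipe
  AUTO_INGEST = TRUE
  INTEGRATION = 'EV_GCS_NOTIFY_INT'
AS
  COPY INTO raw.public.charging_transactions
  FROM @raw.public.ev_landing_stage
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```

Once created, the pipe loads each new CSV landing in the bucket without any scheduled job, which is the "automated CSV processing" the responsibilities section refers to.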
