
VAYUZ Technologies - Lead Data Engineer

VAYUZ Technologies
Anywhere in India/Multiple Locations
3 - 7 Years

Posted on: 17/01/2026

Job Description

Key Responsibilities:


- Lead and manage teams of data engineers, analysts, and developers to deliver high-quality, production-ready data platforms.
- Own the design, development, and maintenance of data pipelines, data lakes, and data warehouses.
- Ensure data availability, performance, reliability, scalability, and cost efficiency across platforms.
- Translate business requirements into robust technical architectures and implementation plans.
- Define and enforce data governance, security, and compliance standards.
- Drive continuous improvement of data engineering processes, frameworks, and tooling.
- Mentor, guide, and develop team members through technical leadership and career planning.
- Manage project delivery, resource planning, timelines, and execution milestones.
- Partner with BI, Product, IT, and Cloud teams to enable organization-wide data-driven decision making.
- Evaluate and adopt emerging technologies and best practices in cloud and data engineering ecosystems.


Required Skills & Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 11+ years of experience in data engineering or related domains, including 3+ years in a lead or people-management role.
- Strong expertise in SQL, data modeling, and ETL/ELT design.
- Hands-on experience with the Azure cloud platform, including Databricks and Snowflake.
- Proficiency with modern data frameworks and tools such as Apache Spark, Kafka, and Airflow.
- Solid knowledge of data governance, security, and compliance frameworks (GDPR, HIPAA, etc.).
- Proven leadership, communication, and stakeholder management capabilities.
- Demonstrated ability to manage complex priorities in high-growth, fast-paced environments.
- Strong hands-on experience in orchestrating Databricks pipelines, automated workflows, and job scheduling.
- Expertise with Databricks Unity Catalog, Hive Metastore, and building Databricks dashboards for monitoring and visualization.
- Working experience with Azure Functions for serverless and event-driven data workflows.
- Proven implementation of Medallion Architecture (Bronze → Silver → Gold) for scalable ingestion, transformation, and delivery.
- Experience building end-to-end data pipelines on Azure + Databricks environments.

