Posted on: 24/03/2026
Key Responsibilities
Snowflake Development:
- Configure and manage Snowflake accounts, roles, virtual warehouses, and secure data storage.
- Design and optimize schemas, tables, and views for efficient querying and analytics.
- Implement ingestion pipelines using Snowpipe, ETL/ELT tools, or custom scripts for diverse data sources.
- Monitor compute and storage costs and implement cost optimization strategies.
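On the cost-monitoring point, the usual source of truth is Snowflake's `SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY` view. A minimal Python sketch of the downstream calculation — warehouse names, credit figures, and the per-credit price below are illustrative assumptions, not real account data:

```python
# Hypothetical rows shaped like WAREHOUSE_METERING_HISTORY output
# (warehouse_name, credits_used); values are made up for illustration.
METERING_ROWS = [
    {"warehouse_name": "LOAD_WH", "credits_used": 12.5},
    {"warehouse_name": "LOAD_WH", "credits_used": 7.5},
    {"warehouse_name": "BI_WH", "credits_used": 3.0},
]

def cost_by_warehouse(rows, price_per_credit):
    """Aggregate credits per warehouse and convert to currency."""
    totals = {}
    for row in rows:
        name = row["warehouse_name"]
        totals[name] = totals.get(name, 0.0) + row["credits_used"]
    return {wh: credits * price_per_credit for wh, credits in totals.items()}

def over_budget(costs, budget):
    """Return warehouses whose spend exceeds the budget threshold."""
    return sorted(wh for wh, cost in costs.items() if cost > budget)

# Assumed price of $3 per credit; real pricing depends on edition and region.
costs = cost_by_warehouse(METERING_ROWS, price_per_credit=3.0)
flagged = over_budget(costs, budget=50.0)
```

In practice the same aggregation is often done directly in SQL against the metering view; the point is simply that cost optimization starts from per-warehouse credit attribution.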
dbt (Data Build Tool):
- Develop modular SQL transformations using dbt for data modeling and analytics.
- Implement data quality checks and testing frameworks within dbt.
- Manage dbt deployments, version control, and CI/CD integration.
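dbt's built-in schema tests (`not_null`, `unique`) reduce to simple assertions over a model's output. A plain-Python sketch of those two checks — the `dim_customer` rows and column names are hypothetical, chosen only to show what the framework verifies:

```python
def not_null(rows, column):
    """dbt-style not_null test: return indices of rows where `column` is NULL."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def unique(rows, column):
    """dbt-style unique test: return values that appear more than once
    (NULLs are skipped, matching dbt's behavior of testing non-null values)."""
    seen, dupes = set(), []
    for row in rows:
        value = row.get(column)
        if value is None:
            continue
        if value in seen:
            dupes.append(value)
        seen.add(value)
    return dupes

# Illustrative rows for a hypothetical `dim_customer` model.
rows = [
    {"customer_id": 1, "name": "Acme"},
    {"customer_id": 2, "name": None},
    {"customer_id": 2, "name": "Globex"},
]
null_names = not_null(rows, "name")
dup_ids = unique(rows, "customer_id")
```

In dbt itself these checks are declared in a model's YAML schema file and executed as SQL; a passing test is one whose query returns zero rows, which is exactly the empty-list case above.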
Apache Airflow:
- Design and orchestrate workflows using Airflow DAGs for ETL and data processing tasks.
- Implement error handling, logging, and alerting mechanisms for Airflow jobs.
- Integrate Airflow with authentication systems (e.g., Okta) and with tools such as Azure Data Factory (ADF) or Ataccama for data governance.
BI and Conversational AI Integration:
- Integrate Microsoft Copilot and/or Snowflake Cortex into BI workflows to enable natural language queries and AI-driven insights.
- Develop and maintain AI-powered dashboards and reporting solutions that support dynamic, conversational interactions.
Data Quality & Governance:
- Apply data validation rules and ensure compliance with enterprise standards.
- Collaborate with data governance teams to enforce naming conventions and security policies.
Collaboration & Documentation:
- Work closely with data analysts, business stakeholders, and engineering teams to deliver reliable data solutions.
- Document architecture, workflows, and operational procedures.
Required Skills:
- Minimum of 4 years' professional experience as a Data Engineer, Data Architect, or in a similar role at Financial Services firms (buy-side preferred).
- Strong proficiency in Snowflake (data warehousing concepts, performance tuning, security).
- Hands-on experience with dbt for data transformation and testing.
- Expertise in Apache Airflow for workflow orchestration.
- Knowledge of cloud data ecosystems (Azure, Snowflake) and integration with Microsoft Copilot, Cortex, or similar conversational AI platforms.
- Strong proficiency in Power BI, DAX, and semantic modeling.
- Solid understanding of SQL, Python, and CI/CD practices (GitLab).
- Familiarity with cloud platforms (Azure preferred) and data integration tools.
- Knowledge of data governance and quality frameworks.
Preferred Qualifications:
- Experience with Azure Data Factory, Ataccama, or similar tools.
- Exposure to Private Credit projects leveraging the Geneva portfolio management platform.
- Experience with AI-driven reporting and Copilot Studio for building custom agents.
- Understanding of Microsoft Fabric and interoperability with Snowflake for unified analytics.
- Strong problem-solving and communication skills.
Posted by
LIGHTING ROD TECH PRIVATE LIMITED
Sr. HR Person at LIGHTING ROD TECH PRIVATE LIMITED
Last Active: 13 Apr 2026
Posted in
Data Engineering
Functional Area
Data Analysis / Business Analysis
Job Code
1622967