hirist

Job Description


Experience - 5 to 7 years

Duration - 6-month contract, with possible extension

Location - Bangalore

We are looking for a professional experienced in designing and deploying GenAI and data-driven solutions on Snowflake. The role requires expertise in Snowflake data engineering, agentic AI workflows (LangChain/LangGraph), and building scalable data pipelines and ML integrations using Snowpark.

Responsibilities:

- Design and optimize data models, schemas, and pipelines in Snowflake (raw, staging, curated layers).

- Prepare architecture diagrams for solutions on Snowflake.

- Build and maintain ELT/ETL workflows using Snowpipe, Streams, Tasks, and external orchestration tools.

- Integrate Agentic AI workflows (LangChain, LangGraph, etc.) with Snowflake as the primary data backbone.

- Enable AI agents to query, reason, and act using Snowflake datasets, including secure read/write operations.

- Implement batch and real-time data processing pipelines in Snowflake.

- Deploy and manage ML models via Snowpark for advanced analytics and AI-driven insights.

- Ensure data governance, RBAC, and compliance across all Snowflake environments.

- Monitor and optimize warehouse performance, cost efficiency, and query execution.

- Take ownership of modules, guide team members, and interact with clients.
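To make the layered pipeline pattern above (raw, staging, curated) concrete, here is a minimal plain-Python sketch. The table shapes, field names, and transform rules are illustrative only; in a real deployment these steps would run as Snowpark DataFrame operations against Snowflake, scheduled with Streams and Tasks.

```python
# Illustrative raw -> staging -> curated flow. In Snowflake these
# steps would be Snowpark transformations; plain dicts stand in
# here so the layering is easy to follow.

def stage_records(raw_rows):
    """Staging layer: drop malformed rows, de-duplicate, normalize fields."""
    staged = []
    seen_ids = set()
    for row in raw_rows:
        if "id" not in row or "amount" not in row:
            continue  # discard malformed input
        if row["id"] in seen_ids:
            continue  # de-duplicate on primary key
        seen_ids.add(row["id"])
        staged.append({
            "id": row["id"],
            "region": row.get("region", "unknown").lower(),
            "amount": float(row["amount"]),
        })
    return staged

def curate(staged_rows):
    """Curated layer: aggregate amounts per region."""
    totals = {}
    for row in staged_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

raw = [
    {"id": 1, "region": "EMEA", "amount": "10.5"},
    {"id": 1, "region": "EMEA", "amount": "10.5"},  # duplicate row
    {"id": 2, "amount": "4.0"},                     # missing region
    {"region": "APAC", "amount": "3.0"},            # malformed: no id
]
print(curate(stage_records(raw)))  # {'emea': 10.5, 'unknown': 4.0}
```

The same separation of concerns (validate and deduplicate in staging, aggregate in curated) is what the Snowflake Streams/Tasks orchestration described above automates at warehouse scale.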

Required Skills:

- Strong expertise in Snowflake Data Cloud (SQL, schema design, performance tuning).

- Hands-on experience with Snowpipe, Streams, Tasks, Clustering, and Partitioning.

- Proficiency in Python and SQL for data engineering and AI workflows.

- Experience with LangChain, LangGraph, or other agentic AI frameworks.

- Knowledge of data governance, RBAC, and security best practices in Snowflake.

- Ability to build scalable ELT/ETL pipelines for structured and semi-structured data.

- Familiarity with Snowpark for deploying ML/AI models.
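As a hedged sketch of the agentic pattern implied above (an AI agent issuing governed reads against Snowflake), the tool function below stubs out the Snowflake session entirely. The allow-list, function names, and returned rows are illustrative assumptions, not part of any specific framework; in practice a function like this would be registered with an agent framework such as LangChain, and real access control would come from Snowflake RBAC rather than an in-process set.

```python
# Hedged sketch: exposing a Snowflake read as an agent-callable tool.
# The query executor is a stub; the focus is the access check an
# agent-facing tool should perform before touching data.

ALLOWED_TABLES = {"curated.sales", "curated.customers"}  # illustrative allow-list

def run_query(sql):
    """Stub standing in for a Snowpark call such as session.sql(...).collect()."""
    return [("EMEA", 10.5)]

def query_snowflake_tool(table, limit=10):
    """Agent-callable read with a basic governance check."""
    if table not in ALLOWED_TABLES:
        raise PermissionError(f"table {table!r} is not readable by this agent")
    if not (1 <= limit <= 1000):
        raise ValueError("limit must be between 1 and 1000")
    return run_query(f"SELECT * FROM {table} LIMIT {limit}")

print(query_snowflake_tool("curated.sales"))
```

Keeping the permission check inside the tool, rather than trusting the agent's prompt, is one way the "secure read/write operations" responsibility above tends to be enforced.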

Note: Immediate joiners only.

