Job Description :

Position : Data Engineer (Snowflake & Looker Specialist)

Experience : 5 - 8 years of relevant experience

Location : India

Job Summary :

We are seeking an experienced Data Engineer with 5 - 8 years of experience, specializing in advanced data warehousing on Snowflake and delivering sophisticated business intelligence via Looker. This role requires deep technical knowledge of Snowflake architecture, advanced SQL programming, and expert application of LookML to build scalable semantic layers and accurate dashboards. The engineer will play a crucial role in designing optimized data models, driving performance tuning efforts, and collaborating with business teams to translate strategic goals into measurable metrics and dimensions.

Key Responsibilities :

Snowflake Architecture and Data Modeling :

- Design, implement, and maintain scalable and resilient data warehouse solutions within the Snowflake cloud data platform.

- Apply deep expertise in Snowflake architecture, including virtual warehouse configuration, storage optimization, clustering keys, and micro-partitions, to ensure cost-efficiency and performance.

- Lead the design and implementation of optimized data modeling strategies (e.g., Dimensional Modeling, Data Vault) within Snowflake, ensuring data structures support rapid query performance for analytical consumption.

- Execute systematic performance tuning on complex SQL queries and data loading processes (e.g., Snowpipe, COPY INTO), continuously monitoring and reducing query execution time and warehouse consumption (a minimal sketch follows this list).

- Utilize advanced SQL programming skills to develop, debug, and maintain complex stored procedures, user-defined functions (UDFs), and data transformation logic directly within the Snowflake environment.
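
To make the above concrete, here is a minimal sketch of the kind of clustered-table DDL, COPY INTO load, and analytical query this list describes, driven from Python via the snowflake-connector-python library. All object names (ANALYTICS_WH, SALES_DB, fact_orders, @sales_stage) and credentials are hypothetical placeholders, not details from this posting.

# Minimal sketch: clustered table DDL, bulk load, and an analytical
# query, executed through snowflake-connector-python. All object
# names and credentials are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # assumption: supplied via config/env
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Dimensional-model fact table with a clustering key chosen to match
# common filter columns, so micro-partition pruning stays effective.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_orders (
        order_id    NUMBER,
        customer_id NUMBER,
        order_date  DATE,
        amount      NUMBER(12, 2)
    )
    CLUSTER BY (order_date)
""")

# Bulk load from an external stage with COPY INTO; Snowpipe would
# automate the same statement for continuous ingestion.
cur.execute("""
    COPY INTO fact_orders
    FROM @sales_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Simple analytical query that benefits from the clustering key.
cur.execute("""
    SELECT order_date, SUM(amount)
    FROM fact_orders
    WHERE order_date >= DATEADD(month, -3, CURRENT_DATE)
    GROUP BY order_date
""")
print(cur.fetchall())

cur.close()
conn.close()

Clustering on order_date assumes most analytical queries filter on recent dates; that design choice is what keeps micro-partition pruning effective and warehouse consumption low.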

Looker Development and Business Intelligence :

- Design, develop, and maintain the semantic layer in Looker using LookML, ensuring consistency, accuracy, and reusability of metrics and dimensions across the organization (see the LookML sketch after this list).

- Build and deploy professional, high-impact dashboards and visualizations in Looker that provide intuitive access to critical business insights and KPIs.

- Apply best practices for building scalable and accurate Looker solutions, including effective management of Explores, proper Git integration, and leveraging caching mechanisms.

- Collaborate closely with business and analytics teams to clearly define core metrics and dimensions that are directly aligned with organizational and departmental business goals.

- Manage user access, content management, and security settings within the Looker platform to ensure data governance and control.
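
As an illustration of the semantic-layer work above, the following hypothetical Python helper writes out a minimal LookML view for the fact_orders table used in the earlier Snowflake sketch. The dimension, dimension_group, and measure blocks show the standard LookML pattern for exposing consistent metrics and dimensions; all names are illustrative assumptions.

# Minimal sketch: generate a LookML view file for the hypothetical
# fact_orders table from the earlier Snowflake example. The embedded
# LookML shows the standard dimension / dimension_group / measure
# pattern for a reusable semantic layer; all names are illustrative.
from pathlib import Path

LOOKML_VIEW = """\
view: fact_orders {
  sql_table_name: SALES_DB.PUBLIC.FACT_ORDERS ;;

  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.ORDER_ID ;;
  }

  dimension_group: order {
    type: time
    timeframes: [date, week, month, quarter, year]
    sql: ${TABLE}.ORDER_DATE ;;
  }

  measure: total_amount {
    type: sum
    sql: ${TABLE}.AMOUNT ;;
  }
}
"""

# In a real project this file would live in a Git-versioned LookML
# repository; writing it locally here is just for illustration.
Path("fact_orders.view.lkml").write_text(LOOKML_VIEW)

Defining total_amount once as a LookML measure, rather than separately in each dashboard, is what keeps the metric consistent and reusable across Explores.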

Data Engineering Practices and Collaboration :

- Participate in the entire data lifecycle, from ingestion and transformation to consumption, ensuring data quality and integrity at every stage (see the validation sketch after this list).

- Utilize strong analytical and problem-solving skills to troubleshoot complex data pipeline failures, model discrepancies, and performance bottlenecks that span the ETL/ELT process and the reporting layer.

- Work effectively with cross-functional teams to understand evolving data needs and deliver robust, production-ready data assets.

- Implement and enforce data governance standards, ensuring compliance and security within the Snowflake environment.
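
As a sketch of the data-quality work referenced above, the checks below run three illustrative validation queries (non-empty load, unique primary key, no NULL amounts) against the hypothetical fact_orders table; the check names and rules are assumptions for illustration.

# Minimal sketch of post-load data quality checks: non-empty load,
# unique primary key, and no NULL amounts, all against the
# hypothetical fact_orders table from the earlier sketch.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="your_user", password="your_password",
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
)
cur = conn.cursor()

checks = {
    "non_empty": "SELECT COUNT(*) > 0 FROM fact_orders",
    "unique_pk": "SELECT COUNT(*) = COUNT(DISTINCT order_id) FROM fact_orders",
    "no_null_amounts": "SELECT COUNT_IF(amount IS NULL) = 0 FROM fact_orders",
}

failures = []
for name, sql in checks.items():
    cur.execute(sql)
    passed = cur.fetchone()[0]   # each check returns a single boolean
    if not passed:
        failures.append(name)

cur.close()
conn.close()

if failures:
    raise RuntimeError(f"Data quality checks failed: {failures}")
print("All data quality checks passed.")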

Required Skills & Qualifications :

- Experience : Mandatory 5 - 8 years of relevant experience in Data Engineering.

- Data Warehousing : Deep expertise in Snowflake architecture, data modeling, and core data warehousing concepts.

- BI/LookML : Mandatory hands-on experience with LookML, building semantic layers, and designing production-grade Looker solutions and dashboards.

- Programming : Expert proficiency in advanced SQL programming for complex query writing and data transformation.

- Core Skills : Strong analytical, problem-solving, and communication skills for effective collaboration with business stakeholders to define metrics.

- Education : Mandatory B.E / B. Tech / MCA.

Preferred Skills :

- ELT Tools : Experience with modern ELT orchestration tools (e.g., Airflow, Matillion, Fivetran) for automating data ingestion into Snowflake (a minimal DAG sketch follows this list).

- Cloud Platforms : Experience with cloud services (AWS, Azure, or GCP) for data storage, networking, and security integration with Snowflake.

- Programming : Proficiency in Python for complex data preparation or scripting utilities.

- Certifications : Snowflake Certifications (e.g., SnowPro Core) or Looker Developer Certifications.

- Version Control : Experience with Git for version control of SQL scripts and LookML code.
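
For the preferred ELT-orchestration experience, here is a minimal sketch of an Airflow DAG that schedules the hypothetical COPY INTO load from the earlier Snowflake sketch. It assumes the apache-airflow-providers-snowflake package and a pre-configured snowflake_default connection; the DAG id, schedule, and SQL are illustrative.

# Minimal sketch of an Airflow DAG that schedules the hypothetical
# COPY INTO load from the earlier Snowflake sketch. Assumes the
# apache-airflow-providers-snowflake package and a pre-configured
# "snowflake_default" connection; all names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="load_fact_orders",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    load_orders = SnowflakeOperator(
        task_id="copy_into_fact_orders",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO fact_orders
            FROM @sales_stage/orders/
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """,
    )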

