hirist

Job Description

About the Job :

We are looking for an experienced Data & Analytics Engineer with strong hands-on expertise in Snowflake, GCP BigQuery, and Data Warehousing concepts.

The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines, ETL processes, and analytical solutions to support business intelligence and data-driven decision-making.

Key Responsibilities :

- Design and implement robust ETL pipelines using Informatica (IVS version) and BDT tools.

- Develop, optimize, and manage data warehouses and data models to ensure efficient data storage and retrieval.

- Work extensively on Snowflake and GCP BigQuery for data integration, transformation, and analytics.

- Collaborate with business teams to translate requirements into scalable data solutions.

- Create and maintain dashboards and reports using Domo (preferred) or equivalent BI tools such as Power BI or Looker.

- Ensure data quality, governance, and consistency across systems.

- Continuously improve system performance and data processing workflows.

Required Skills & Experience :

- Snowflake - Mandatory (hands-on experience)

- ETL Tools : Informatica (IVS version), BDT

- GCP BigQuery - Mandatory (hands-on experience)

- Data Modelling & Data Warehousing - Mandatory (hands-on experience)

- Analytics Tools : Domo (high priority) | Alternatives : Power BI, Looker

- Strong SQL and data transformation skills

- Excellent communication, logical thinking, and technical acumen

Preferred Qualifications :

- Experience working in cloud-based data environments (GCP, AWS, or Azure).

- Knowledge of automation and data orchestration tools (Airflow, Dataflow, etc.).

- Ability to work independently in a fast-paced, collaborative environment.
