hirist

SquareShift - Senior Data Engineer - Snowflake DB

SQUARESHIFT TECHNOLOGIES PRIVATE LIMITED
7 - 10 Years
Chennai

Posted on: 01/04/2026

Job Description


Job Title : Data Engineer

Location : Chennai, India (On-site)

Employment Type : Full-time

Experience : 7+ Years

Role Overview :


We are seeking a highly skilled Data Engineer to design, build, and maintain scalable data infrastructure and pipelines. The ideal candidate will have strong expertise in modern data engineering practices, with hands-on experience in Snowflake, cloud platforms, and large-scale data processing.

In this role, you will work closely with data analysts, BI teams, and product stakeholders to deliver high-quality, reliable, and optimized datasets that power business insights and decision-making.

Key Responsibilities :


- Design, develop, and maintain scalable data pipelines and ETL/ELT workflows.

- Build and optimize data models and data warehouse architectures.

- Work extensively with Snowflake for data storage, transformation, and performance tuning.

- Collaborate with BI, analytics, and product teams to deliver reliable and clean datasets.

- Ensure data quality, integrity, and governance across multiple data sources.

- Optimize query performance and cost efficiency within Snowflake environments.

- Integrate and manage data from multiple sources including APIs, databases, and third-party systems.

- Implement and maintain data security, access controls, and compliance practices.

- Monitor and troubleshoot data pipelines to ensure high availability and reliability.
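The pipeline, data-quality, and loading responsibilities above can be sketched as a minimal extract-transform-load step. This is purely illustrative: the function names, the CSV source, and the in-memory "warehouse" target are invented stand-ins for a real Snowflake pipeline, not part of the role's actual stack.

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: enforce a simple data-quality rule (amount must be
    numeric and non-negative) and normalise types; bad rows are dropped."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # data-quality rule: skip malformed records
        if amount >= 0:
            clean.append({"order_id": row["order_id"], "amount": amount})
    return clean

def load(rows: list[dict], target: list[dict]) -> int:
    """Load: append to the target store; returns the number of rows loaded."""
    target.extend(rows)
    return len(rows)

# Two of the four source rows fail the quality rule and are filtered out.
raw = "order_id,amount\nA1,10.5\nA2,-3\nA3,notanumber\nA4,7"
warehouse: list[dict] = []
loaded = load(transform(extract(raw)), warehouse)
```

In a production setting each stage would be an orchestrated task (e.g. in Airflow or dbt) with monitoring and alerting around the quality checks rather than silent row drops.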

Required Skills & Qualifications :


- 7+ years of experience in Data Engineering or a related field.

- Strong hands-on expertise with Snowflake (data modeling, optimization, performance tuning).

- Advanced proficiency in SQL and experience handling large-scale datasets.

- Hands-on experience with ETL/ELT tools such as Airflow, Informatica, dbt, or similar platforms.

- Experience working with cloud platforms such as AWS, Azure, or GCP.

- Strong understanding of data warehousing concepts including star schema and snowflake schema.

- Proficiency in Python or Scala for data processing.

- Experience with data pipeline orchestration and scheduling tools.
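The star-schema concept listed above can be illustrated with a toy example: one fact table referencing one dimension table, queried with the typical join-and-aggregate pattern. All table and column names here are invented for the sketch (SQLite stands in for a warehouse engine such as Snowflake).

```python
import sqlite3

# Toy star schema: fact_sales (measures) joined to dim_product (attributes).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        quantity   INTEGER,
        revenue    REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'),
                                   (2, 'Gadget', 'Hardware');
    INSERT INTO fact_sales VALUES (10, 1, 3, 30.0),
                                  (11, 2, 1, 15.0),
                                  (12, 1, 2, 20.0);
""")

# Typical star-schema query: join the fact table to its dimension and
# aggregate a measure by a dimension attribute.
rows = conn.execute("""
    SELECT p.product_name, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.product_name
    ORDER BY total_revenue DESC
""").fetchall()
```

A snowflake schema would differ only in normalising the dimension further (e.g. splitting `category` into its own table referenced by `dim_product`).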

Preferred Skills :


- Experience with big data technologies such as Apache Spark or Hadoop.

- Familiarity with streaming platforms like Kafka or Kinesis.

- Experience implementing CI/CD pipelines and DevOps best practices.

- Exposure to data governance frameworks and security best practices.

Key Competencies :


- Strong analytical and problem-solving skills.

- Ability to collaborate effectively with cross-functional teams.

- Excellent communication and stakeholder management skills.

