DigiKey - Data Engineer - SQL/Python

DIGI-KEY ELECTRONICS & AUTOMATION TRADING PVT LTD
Multiple Locations
3-5 Years

Posted on: 12/01/2026

Job Description

About the Role:

We are seeking a highly experienced Data Engineer to develop and optimize our modern cloud data platform infrastructure. This role is ideal for someone who thrives in a fast-paced environment, is passionate about data architecture, and has a deep understanding of data transformation, modeling, and orchestration using modern tools such as dbt-core, Snowflake, and Python.

Key Responsibilities:

- Develop and implement scalable data pipelines using dbt-core, Python, and SQL to support analytics, reporting, and data science initiatives.

- Optimize data models in Snowflake to support efficient querying and storage.

- Develop and maintain our data warehouse, ensuring data quality, governance, and performance.

- Collaborate with cross-functional teams including data analysts, data architects, data scientists, and business stakeholders to understand data needs and deliver robust solutions.

- Develop and maintain best-practice SOPs for version control (Git), CI/CD pipelines, and data pipeline monitoring.

- Take ownership of tasks, be self-driven, and work in a culture of technical excellence and continuous improvement.

- Independently execute proofs of concept (POCs) to evaluate and recommend new tools and technologies that enhance the data platform.

- Provide ongoing support for the existing ELT/ETL processes and procedures.

- Perform other duties as assigned, including but not limited to possible reallocation of effort to other organizations per business need and management request.

Required Qualifications:

- Bachelor's degree in Computer Science or a related field (16 years of formal education related to engineering).

- Around 5 years of experience in data engineering, including at least 1 year of experience with Snowflake and Python.

- Expert-level proficiency in SQL for data transformation and automation.

- Experience with dbt-core for data modeling and transformation.

- Strong hands-on experience in cloud platforms (Microsoft Azure) and cloud data platforms (Snowflake).

- Proficiency with Git and collaborative development workflows; familiarity with Microsoft VS Code or similar IDEs; knowledge of Azure DevOps or GitLab development operations and job scheduling tools.

- Solid understanding of modern data warehousing architecture, dimensional modeling, and ELT/ETL frameworks.

- Excellent communication skills and the ability to translate complex technical concepts to non-technical stakeholders.

- Proven expertise in designing and implementing batch and streaming data pipelines to support near real-time and large-scale data processing needs.

Preferred Qualifications:

- Experience working in a cloud-native environment (AWS, Azure, or GCP).

- Familiarity with data governance, security, and compliance standards.

- Prior experience with Apache Kafka (Confluent), DataOps.Live, or Atlan.

- Hands-on experience with orchestration tools (e.g., Active Batch, Airflow, Prefect).

