hirist

Data Engineer - Python/PySpark

D Square Consulting Services Pvt Ltd
5 - 7 Years
Bangalore

Posted on: 17/04/2026

Job Description


Role : Data Engineer (Contract)

This is a contractual position for D Square Consulting Services Pvt Ltd

Experience : 5-7 Years

Location : Bangalore

Work mode : Hybrid

Notice period : Immediate to 30 days

Job Summary :

We are seeking a skilled Data Engineer to join our team. The ideal candidate will have strong hands-on experience with data platforms, including Snowflake/Databricks, PySpark, Streamlit, GitHub, and Python. You will design, build, and maintain scalable data pipelines to support analytics and reporting needs.

Must-Have skills :

- 5+ years of experience in data engineering or a related role.

- Hands-on experience with Snowflake/Databricks, PySpark, Streamlit, GitHub, and Python.

- Strong knowledge of ETL/ELT processes, data modeling, and data warehousing.

- Proficiency in SQL for querying and analysis.

- Familiarity with data visualization tools like Power BI.

- Excellent problem-solving skills and attention to detail.

- Strong communication and collaboration skills in team environments.
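As a rough illustration of the kind of SQL querying and analysis this role calls for, here is a minimal sketch using Python's built-in sqlite3 module; the `sales` table, its columns, and the sample data are hypothetical:

```python
import sqlite3

# In-memory database with a hypothetical sales table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("South", 80.0), ("North", 50.0)],
)

# A typical analysis query: aggregate revenue per region, highest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('North', 170.0), ('South', 80.0)]
conn.close()
```

In practice the same GROUP BY / ORDER BY patterns would run against Snowflake or Databricks SQL rather than SQLite, but the querying skills transfer directly.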

Good-to-have skills :

- Knowledge of data governance and security practices.

- Experience with data observability best practices.

- Proficiency in Scala or Java.

Roles & Responsibilities :

- Design, implement, and maintain robust data pipelines using Databricks for ingesting, transforming, and delivering data for analytics and reporting.

- Utilize Snowflake/Databricks for data storage, management, and optimization to ensure high performance and reliability.

- Develop and manage data transformation workflows using PySpark, focusing on data quality, consistency, and documentation.

- Collaborate with Power BI developers and stakeholders to build interactive dashboards and data applications using Streamlit for actionable insights.

- Write efficient Python code to automate processes, cleanse data, and handle ETL workflows.

- Partner with cross-functional teams (analysts, business stakeholders) to gather requirements and deliver solutions.

- Implement monitoring for data quality and pipeline health; troubleshoot issues and optimize performance.

- Follow GitHub best practices for version control.

- Document data workflows, methodologies, and best practices for team knowledge sharing.
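The ETL responsibilities above can be sketched with a small, stdlib-only Python example of an extract-transform-load flow with data cleansing; the field names, sample data, and cleansing rules are hypothetical, and a production pipeline of this kind would typically run as PySpark jobs on Databricks rather than plain Python:

```python
import csv
import io

# Hypothetical raw extract: a messy CSV with a blank and an invalid row.
RAW = """order_id,amount
1001, 250.00
1002,
1003,99.50
bad-row,abc
"""

def extract(text):
    """Extract step: parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform step: cleanse by dropping rows with missing or
    invalid values and casting fields to proper types."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "amount": float(row["amount"]),
            })
        except (ValueError, TypeError):
            continue  # skip unparseable rows instead of failing the run
    return clean

def load(rows):
    """Load step: here, just compute a summary for reporting."""
    return {"orders": len(rows), "revenue": sum(r["amount"] for r in rows)}

summary = load(transform(extract(RAW)))
print(summary)  # {'orders': 2, 'revenue': 349.5}
```

The same extract/transform/load separation maps directly onto PySpark DataFrame reads, transformations, and writes, with monitoring hooks added around each stage for pipeline health.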

Qualifications :

- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field.
