hirist

Databricks Developer - ETL/PySpark

Jigya Software Services (P) Ltd
Multiple Locations
3 - 5 Years
Rating: 4 (15+ Reviews)

Posted on: 05/10/2025

Job Description


Job Title : Senior Databricks Developer.

Location : Remote.

The Role :


We are looking for a passionate Databricks Developer to architect and implement our next-generation data platform. You will be instrumental in building robust, scalable data pipelines, optimizing our cloud data warehouse, and creating powerful visualizations that drive key business decisions.



Key Responsibilities :



- Design, develop, and maintain efficient and reliable ETL/ELT pipelines using Databricks, PySpark, and Spark SQL.

- Build and optimize our cloud data warehouse (e.g., Snowflake, BigQuery, Synapse) for performance and scalability.

- Develop interactive and insightful dashboards and reports using Power BI/Tableau for various business units.

- Collaborate with data analysts and business stakeholders to translate requirements into technical solutions.

- Implement and manage Delta Lake and Lakehouse architecture for unified data analytics.

- Perform data modeling, performance tuning, and query optimization.

- Ensure best practices in data governance, security, and quality across all data assets.

- Work in an Agile environment, utilizing CI/CD and DevOps practices for data engineering.
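For candidates less familiar with the Lakehouse pattern mentioned above: the core of a Delta Lake pipeline is an upsert ("MERGE") of incoming records into a target table by key. A minimal sketch of that behavior, written in plain Python dicts so it runs without a Spark cluster (function and field names here are illustrative, not from the role):

```python
def merge_upsert(target, updates, key="id"):
    """Upsert `updates` into `target` by `key`, mimicking Delta Lake's
    MERGE semantics: matching rows are updated, new rows are inserted."""
    merged = {row[key]: row for row in target}
    for row in updates:
        # Update the existing row if the key matches, otherwise insert it.
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return sorted(merged.values(), key=lambda r: r[key])

existing = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
incoming = [{"id": 2, "amount": 250}, {"id": 3, "amount": 300}]
print(merge_upsert(existing, incoming))
# → [{'id': 1, 'amount': 100}, {'id': 2, 'amount': 250}, {'id': 3, 'amount': 300}]
```

In Databricks itself the same step would be a single `MERGE INTO` statement (or `DeltaTable.merge` in PySpark) against a Delta table.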



Required Qualifications (Must-Haves) :



- 3+ years of experience as a Data Engineer, BI Developer, or similar role.

- Hands-on commercial experience with Databricks.

- Strong proficiency in PySpark and Spark SQL.

- Proven experience building dashboards with Power BI or Tableau.

- Hands-on experience with a cloud data warehouse (Snowflake, Azure Synapse, Google BigQuery, or AWS Redshift).

- Expert-level SQL skills and experience in query optimization.

- Solid understanding of ETL/ELT concepts and data modeling techniques (e.g., Star Schema).

- Excellent problem-solving abilities and strong communication skills.



Preferred Qualifications (Good-to-Have) :



- Experience with data orchestration tools like Azure Data Factory or Apache Airflow.

- Familiarity with DevOps practices and CI/CD pipelines for data projects.

- Knowledge of the dbt (data build tool) framework.

- Experience working in an Agile/Scrum development process.

