hirist

Data Engineer - ETL/Python

4 - 8 Years
Bangalore

Posted on: 11/03/2026

Job Description


Role Overview :

We are seeking a Data Engineer with strong expertise in modern data platforms to design and build scalable data pipelines and data lakehouse architectures.

The ideal candidate will have hands-on experience with Databricks, Apache Spark, and cloud-based data platforms, enabling reliable data ingestion, transformation, and analytics at scale.

You will collaborate closely with data science, analytics, and engineering teams to develop high-performance data solutions that power advanced analytics and machine learning workloads.

Key Responsibilities :

- Design, develop, and optimize ETL/ELT pipelines using Databricks and Apache Spark.

- Build and maintain scalable Delta Lake architectures following Bronze, Silver, and Gold data layer patterns.

- Develop data ingestion frameworks to collect data from APIs, databases, streaming platforms, and file-based sources.

- Optimize Spark workloads to improve performance, scalability, and cost efficiency.

- Implement data quality checks, validation pipelines, and monitoring frameworks to ensure data reliability.

- Collaborate with analytics and machine learning teams to enable data availability for reporting, analytics, and AI models.

- Ensure adherence to data governance, security, and compliance best practices.

- Automate data workflows and pipeline orchestration using Databricks Jobs and CI/CD processes.
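The data quality responsibility above can be illustrated with a minimal, library-agnostic sketch in Python. This is not a Databricks API; the function names, rule names, and the valid/quarantine split are hypothetical, shown only to indicate the kind of validation pipeline the role involves.

```python
def validate_record(record, required_fields, non_negative_fields):
    """Return a list of rule violations for a single record (dict)."""
    violations = []
    # Completeness rule: required fields must be present and non-empty.
    for field in required_fields:
        if record.get(field) in (None, ""):
            violations.append(f"missing:{field}")
    # Range rule: numeric fields must not be negative.
    for field in non_negative_fields:
        value = record.get(field)
        if isinstance(value, (int, float)) and value < 0:
            violations.append(f"negative:{field}")
    return violations


def validate_batch(records, required_fields, non_negative_fields=()):
    """Split a batch into clean records and quarantined records with reasons."""
    valid, quarantined = [], []
    for record in records:
        violations = validate_record(record, required_fields, non_negative_fields)
        if violations:
            quarantined.append({"record": record, "violations": violations})
        else:
            valid.append(record)
    return valid, quarantined
```

In practice the clean records would continue into the Silver layer while the quarantined ones, with their violation reasons, would feed a monitoring or alerting framework.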

Required Skills & Qualifications :

- Strong hands-on experience with Databricks and Apache Spark (PySpark or Scala).

- Proficiency in Python and SQL for data processing and transformation.

- Experience implementing Delta Lake and modern Lakehouse architectures.

- Hands-on experience with cloud platforms such as Amazon Web Services, Microsoft Azure, or Google Cloud.

- Strong understanding of data modeling, data warehousing, and big data architecture concepts.

- Experience with workflow orchestration tools such as Apache Airflow or Azure Data Factory.

- Familiarity with CI/CD pipelines and automated data workflows.

