hirist

Big Data Engineer - Spark/Scala

Strategic HR Solutions
Multiple Locations
6 - 10 Years

Posted on: 31/08/2025

Job Description

Spark Scala Developer

Location : Bengaluru, Mumbai

Employment Type : Full-time

What We're Looking For


We're hiring a Spark Scala Developer with real-world experience in Big Data environments, on-prem and/or in the cloud. You should know how to write production-grade Spark applications, fine-tune performance, and work fluently in Scala's functional style. Experience with cloud platforms and modern data tools like Snowflake or Databricks is a strong plus.

Your Responsibilities :


- Design and develop scalable data pipelines using Apache Spark and Scala

- Optimize and troubleshoot Spark jobs for performance (e.g. memory management, shuffles, data skew)

- Work with massive datasets in on-prem Hadoop clusters or cloud platforms like AWS/GCP/Azure

- Write clean, modular Scala code using functional programming principles

- Collaborate with data teams to integrate with platforms like Snowflake, Databricks, or data lakes

- Ensure code quality, documentation, and CI/CD practices are followed

Must-Have Skills :

- 3+ years of experience with Apache Spark in Scala

- Deep understanding of Spark internals: DAG, stages, tasks, caching, joins, partitioning

- Hands-on experience with performance tuning in production Spark jobs

- Proficiency in Scala functional programming (e.g. immutability, higher-order functions, Option/Either)

- Proficiency in SQL

- Experience with any major cloud platform: AWS, Azure, or GCP
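As a rough illustration of the Scala functional-programming proficiency listed above (immutability, higher-order functions, Option/Either), here is a minimal, self-contained sketch in plain Scala. The `Event` type, field names, and CSV-ish input format are hypothetical, chosen only to demonstrate the idioms; real pipeline code would of course run inside Spark.

```scala
object FunctionalSketch {
  // Hypothetical record type, for illustration only.
  final case class Event(userId: String, amountCents: Long)

  // Option: a safe parse instead of null / exceptions.
  def parseAmount(raw: String): Option[Long] =
    raw.toLongOption.filter(_ >= 0)

  // Either: carry an error message on the failure side.
  def parseEvent(line: String): Either[String, Event] =
    line.split(',') match {
      case Array(user, amount) =>
        parseAmount(amount.trim)
          .toRight(s"bad amount: $amount")
          .map(Event(user.trim, _))
      case _ => Left(s"malformed line: $line")
    }

  // Higher-order functions over immutable collections: no vars, no mutation.
  def totalValidCents(lines: List[String]): Long =
    lines.map(parseEvent).collect { case Right(e) => e.amountCents }.sum
}
```

For example, `FunctionalSketch.totalValidCents(List("u1,100", "u2,oops", "u3,50"))` sums only the lines that parse cleanly, discarding the malformed one.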

Nice-to-Have :

- Worked with Databricks, Snowflake, or Delta Lake

- Exposure to data pipeline tools like Airflow, Kafka, Glue, or BigQuery

- Familiarity with CI/CD pipelines and Git-based workflows

- Comfortable with SQL optimization and schema design in distributed environments
