Posted on: 24/07/2025
Job Description :
Eucloid is looking for a Senior Data Engineer with hands-on Databricks expertise to join our Data Platform team, which supports a range of business applications. The ideal candidate will help develop data infrastructure on Databricks for our clients, contributing to everything from upstream and downstream technology selection to the design and construction of individual components.
The candidate will also work on projects such as integrating data from diverse sources and managing big data pipelines that remain easily accessible while keeping the overall ecosystem performant. The ideal candidate is an experienced data wrangler who will support our software developers, database architects, and data analysts on business initiatives. You must be self-directed and comfortable supporting the data needs of cross-functional teams, systems, and technical solutions.
Qualifications :
- B.Tech/BS degree in Computer Science, Computer Engineering, Statistics, or other Engineering disciplines
- Min. 5 years of professional work experience, with 1+ years of hands-on experience with Databricks
- Highly proficient in SQL and data modeling (conceptual and logical) concepts
- Highly proficient with Python and Spark (3+ years)
- Knowledge of distributed computing and cloud data warehouses such as Redshift, BigQuery, etc.
- 2+ years of hands-on experience with one of the major cloud platforms (AWS/GCP/Azure)
- Experience with modern data stack tools such as Airflow, Terraform, dbt, Glue, Dataproc, etc.
- Exposure to Hadoop and shell scripting is a plus
Responsibilities :
- Design, implement, and improve data infrastructure processes and automation
- Tune data pipelines for reliability and performance
- Build tools and scripts to develop, monitor, and troubleshoot ETLs
- Perform scalability, latency, and availability tests on a regular basis
- Perform code reviews and QA data imported by various processes
- Investigate, analyze, correct, and document reported data defects
- Create and maintain technical specification documentation
Posted in
Data Engineering
Functional Area
Big Data / Data Warehousing / ETL
Job Code
1518576