Posted on: 24/01/2026
Job Responsibilities:
- In-depth knowledge of Data Lake, Lakehouse, and Data Mesh architectures.
- Experience building data platforms using Databricks, Delta Lake (on-prem), or Snowflake.
- Proficient in ingesting structured, semi-structured, and unstructured data.
- Strong hands-on experience with Python, PySpark, SQL, and APIs for data ingestion and transformation.
- Experience with ETL / ELT pipelines, streaming (Kafka, Kinesis), and batch processing (Spark, Glue, DBT).
- Strong experience working with Parquet, JSON, and CSV files and sensor data, and optimizing large-scale analytical datasets.
- Collaborate with team members to design and operationalize data lake solutions.
- Excellent problem-solving, communication, and team collaboration skills.
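As a minimal illustration of the ingestion and transformation skills listed above, the sketch below normalizes structured (CSV) and semi-structured (JSON-lines) records into one target schema using only the Python standard library. The file contents, field names, and `normalize` mapping are hypothetical, not part of the role description.

```python
import csv
import io
import json

def ingest_csv(text):
    """Parse structured CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def ingest_json_lines(text):
    """Parse semi-structured JSON-lines records into dicts."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def normalize(record):
    """Map heterogeneous source fields onto one target schema (hypothetical fields)."""
    return {
        "sensor_id": str(record.get("sensor_id") or record.get("id")),
        "reading": float(record.get("reading") or record.get("value")),
    }

# Hypothetical sample inputs: one structured file, one semi-structured feed.
csv_data = "sensor_id,reading\ns1,21.5\ns2,19.0\n"
json_data = '{"id": "s3", "value": 22.1}\n'

rows = [normalize(r) for r in ingest_csv(csv_data) + ingest_json_lines(json_data)]
```

In a Databricks or Snowflake pipeline the same pattern would run at scale via PySpark DataFrames or COPY INTO rather than the standard library; the normalization step is what carries over.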
Job Requirements:
- B.Tech in Computer Science or Master of Computer Applications.
- 6-8 years of relevant experience.
- Expertise in data modeling techniques including star/snowflake schemas and 3NF.
- Proficiency in Python programming.
- Hands-on experience with Kafka, Databricks, and Snowflake.
- Strong understanding of data governance, lineage, data cataloging, and compliance (GDPR, HIPAA).
- Willingness to learn and adapt to new data engineering technologies.
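To illustrate the star-schema modeling expertise mentioned in the requirements, here is a minimal sketch using Python's built-in sqlite3: one dimension table of descriptive attributes, one fact table of measures keyed to it, and the typical join-and-aggregate query. Table and column names are hypothetical examples, not taken from the role.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes keyed by a surrogate key.
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# Fact table: numeric measures plus foreign keys to dimensions (the star's center).
cur.execute("CREATE TABLE fact_sales (product_key INTEGER, amount REAL)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?)", [(1, 10.0), (1, 5.0), (2, 7.5)])

# Typical star-schema query: join facts to a dimension, then aggregate.
cur.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product d ON f.product_key = d.product_key
    GROUP BY d.name
    ORDER BY d.name
""")
totals = cur.fetchall()
```

A snowflake schema would further normalize the dimension (e.g. splitting product category into its own table), trading query simplicity for less redundancy; 3NF goes furthest in that direction.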
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1605689