MonoSpear Technologies - Senior Data Engineer - ETL/ELT Workflows

MonoSpear Technologies Pvt Ltd
Anywhere in India/Multiple Locations
6 - 10 Years

Posted on: 04/12/2025

Job Description

About the Role

We are looking for an experienced Senior Data Engineer to design, build, and optimize scalable ETL/ELT data pipelines, data models, and large-scale data processing systems.

You will play a key role in building reliable data foundations that power analytics, reporting, AI/ML, and product features.

The ideal candidate has strong SQL skills, deep familiarity with modern data engineering tools and cloud platforms, and a firm grasp of best practices in data quality, performance, and governance.

Key Responsibilities:

- Design, develop, and maintain scalable ETL/ELT workflows for ingesting, transforming, and processing structured and unstructured data.

- Build highly efficient data pipelines that handle batch, streaming, or near-real-time data.

- Implement optimized code for data extraction, transformation, cleansing, enrichment, and loading.

- Design and implement data models (OLTP, OLAP, dimensional modeling, star/snowflake schemas); a minimal star-schema sketch follows this list.

- Develop and optimize data lake and data warehouse architectures.

- Work with analytics and product teams to define data structures for new features and dashboards.

- Build pipelines using modern data engineering technologies such as Spark, Airflow, dbt, Kafka, Snowflake, BigQuery, Redshift, Databricks, or similar tools, depending on the stack; a sample Airflow DAG sketch follows this list.

- Leverage cloud-native components for storage, compute, orchestration, and automation.

- Implement data validation, anomaly detection, and monitoring frameworks for pipeline reliability; a toy anomaly-check sketch follows this list.

- Develop CI/CD pipelines for data workflows and ensure versioning, traceability, and reproducibility.

- Ensure compliance with data governance, security, privacy, and audit requirements.

- Automate workflows to improve reliability, reduce manual interventions, and enhance scalability.

- Work closely with data analysts, ML engineers, product managers, and business stakeholders to understand data needs and translate them into engineering solutions.

- Collaborate with software engineers and platform teams to integrate data pipelines into broader system architecture.

- Participate in architecture reviews, design discussions, and sprint planning.

- Optimize SQL queries, transformation logic, and compute workloads for performance and cost efficiency.

- Identify system bottlenecks and propose data engineering enhancements.

- Implement monitoring, alerting, and logging for pipeline stability and transparency.

- Provide technical guidance to junior data engineers.

- Drive best practices for coding standards, documentation, and data engineering workflows.

- Lead proof-of-concept efforts for new data technologies or architectural improvements.
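
As a concrete illustration of the dimensional-modeling responsibility above, the sketch below builds a minimal star schema in Python, using SQLite purely as a stand-in engine. The table and column names (dim_customer, dim_date, fact_sales) are hypothetical; a production warehouse such as Snowflake, BigQuery, or Redshift would use its own DDL dialect.

    import sqlite3  # lightweight stand-in; real warehouses have their own DDL dialects

    # Hypothetical star schema: one fact table surrounded by two dimensions.
    DDL = """
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE dim_date (
        date_key       INTEGER PRIMARY KEY,
        calendar_date  TEXT,
        fiscal_quarter TEXT
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        date_key     INTEGER REFERENCES dim_date(date_key),
        amount       REAL
    );
    """

    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)  # executescript runs all three CREATE statements

Queries against this shape join the fact table to whichever dimensions a dashboard needs, which is what keeps star schemas friendly to analytics workloads.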
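
Likewise, a daily ETL workflow on Airflow might be wired together as in the sketch below. The DAG id, schedule, and the extract/transform/load callables are placeholder assumptions for illustration, not part of this role's actual stack.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder callables; real tasks would pull from a source system,
    # apply transformations, and write results to the warehouse.
    def extract(**context): ...
    def transform(**context): ...
    def load(**context): ...

    with DAG(
        dag_id="daily_sales_etl",        # hypothetical pipeline name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",               # Airflow 2.4+ parameter name
        catchup=False,                   # do not backfill missed runs
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Linear dependency chain: extract -> transform -> load
        t_extract >> t_transform >> t_load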
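
And for the validation and anomaly-detection point, a toy row-count check might look like the following; the 50% tolerance and seven-day baseline are arbitrary assumptions chosen only to make the idea concrete.

    # Toy anomaly check: flag today's load when its row count deviates more
    # than `tolerance` (here 50%) from the trailing seven-day average.
    def row_count_anomaly(today: int, history: list[int], tolerance: float = 0.5) -> bool:
        baseline = sum(history) / len(history)
        return abs(today - baseline) > tolerance * baseline

    assert row_count_anomaly(10, [100, 95, 105, 98, 102, 99, 101])       # sharp drop -> flagged
    assert not row_count_anomaly(100, [100, 95, 105, 98, 102, 99, 101])  # normal day -> passes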

Required Qualifications:

Technical Skills:

- 6+ years of hands-on experience in ETL/ELT pipeline development and large-scale data processing.

- Strong expertise in SQL (complex queries, optimization, indexing, query plans); a short query-plan sketch follows this list.

- Proficiency in a programming language such as Python, Scala, or Java.

- Experience with ETL/ELT orchestration tools (Airflow, dbt, Dataflow, Glue, ADF, Informatica, Talend, etc.).

- Hands-on experience with cloud platforms such as AWS, GCP, or Azure.

- Familiarity with data warehousing technologies (Snowflake, BigQuery, Redshift, Synapse).

- Experience with big data frameworks (Spark, Hadoop, Flink) is a strong plus.

- Good understanding of APIs, microservices, and integration patterns.
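
To make the SQL expertise above concrete, the sketch below shows the kind of query-plan reasoning involved, using SQLite's EXPLAIN QUERY PLAN as a lightweight stand-in for a warehouse optimizer; the table and index names are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")

    # Without an index, filtering on customer_id forces a full table scan.
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (42,)
    ).fetchall())  # detail column reads: SCAN orders

    # An index on the filter column lets the planner switch to a lookup.
    conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (42,)
    ).fetchall())  # detail column reads: SEARCH orders USING INDEX idx_orders_customer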

