
Inxite Out - Senior Data Engineer II

INXITE-OUT PRIVATE LIMITED
Any Location
5 - 8 Years
Rating: 4.3 (16+ Reviews)

Posted on: 05/11/2025

Job Description

Description :


We are looking for an experienced Senior Data Engineer (Level II) with strong expertise in SQL, PySpark, ETL development, Data Lake architectures, and the Azure Data Platform.

The ideal candidate will have hands-on experience in building and maintaining scalable data pipelines, cloud-based data solutions, and cross-functional data workflows.

The role requires technical depth, ownership, and the ability to collaborate closely with engineering, analytics, and business teams.


Key Responsibilities :


- Design, build, and manage end-to-end data pipelines using Python and PySpark (a minimal illustrative sketch follows this list).

- Develop ETL/ELT workflows to ingest, clean, transform, and expose data for analytical and operational use cases.

- Ensure data pipelines are scalable, optimized, and designed for performance and reliability.

- Work extensively with Azure Data Factory, Azure Data Lake Storage, Azure Blob Storage, and Azure Synapse Analytics.

- Architect and manage data solutions following Azure security and performance best practices.

- Implement and enhance data lake zones, metadata-driven pipelines, and monitoring dashboards.

- Write advanced SQL queries for data extraction, transformation, and reporting.

- Optimize queries for performance and cost efficiency in cloud data environments.

- Design logical, physical, and dimensional data models aligned with reporting needs.
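For context, the pipeline and SQL responsibilities above describe work of roughly the following shape. This is a minimal, illustrative PySpark sketch only; the storage account, containers, paths, and column names (examplelake, raw/curated, orders, amount, region) are assumed placeholders, not details taken from this posting.

# Minimal, illustrative sketch of an ingest -> clean -> aggregate -> publish pipeline.
# All names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Ingest: read raw CSV files landed in an assumed "raw" data-lake zone (ADLS Gen2).
raw = (
    spark.read.option("header", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/")
)

# Clean and transform: type-cast, drop incomplete rows, derive a reporting date.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["order_id", "order_ts"])
    .withColumn("order_date", F.to_date("order_ts"))
)
clean.createOrReplaceTempView("orders_clean")

# Aggregate with SQL for the analytical layer: daily revenue and order counts per region.
daily = spark.sql("""
    SELECT order_date,
           region,
           SUM(amount)              AS revenue,
           COUNT(DISTINCT order_id) AS orders
    FROM orders_clean
    GROUP BY order_date, region
""")

# Expose: write partitioned Parquet to an assumed "curated" zone for downstream reporting.
(
    daily.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("abfss://curated@examplelake.dfs.core.windows.net/daily_revenue/")
)

In practice, a job like this would typically be parameterized and orchestrated from Azure Data Factory or a similar scheduler rather than run standalone.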



Collaboration & Communication :

- Work closely with Data Scientists, BI/Analytics teams, and business stakeholders to understand data requirements.

- Communicate technical decisions, architecture patterns, and data flows clearly and effectively.

- Participate in sprint planning, estimation, documentation, and delivery reviews.


Software Engineering & DevOps Practices :

- Use Git for version control and maintain clean, organized code repositories.

- Work within agile development practices, using Jira or similar PM tools.

- Contribute to CI/CD pipelines for data deployments and workflow automation.


Requirements :

- Bachelor's degree (B.Sc / BCA / B.Tech / B.E in any specialization).

- Excellent communication skills, both written and verbal.

- Stable high-speed internet connection is required for remote collaboration.

- Azure Certifications (e.g., DP-203) will be an added advantage.

- Exposure to Delta Lake or Medallion Architecture.

- Familiarity with Airflow or orchestration frameworks.

- Experience working with distributed systems and large-scale datasets.

