hirist

Reckonsys Tech Labs - Senior Data Engineer - Python/SQL

Posted on: 22/07/2025

Job Description

- IMMEDIATE JOINERS ONLY -

About the Role:

We are looking for a Senior Data Engineer who will play a key role in designing, optimizing, and maintaining ETL pipelines in a Python & SQL-based ecosystem.

This role requires strong architectural experience, ensuring that new data pipeline solutions are scalable, maintainable, and aligned with existing designs.

You will be responsible for evaluating current ETL designs, identifying improvements, and proposing well-structured solutions for new requirements.

You should have a deep understanding of data integration, performance optimization, and be comfortable making architectural decisions that impact data workflows.

Key Responsibilities:

- Design & optimize ETL pipelines using Python (Pandas) and SQL, ensuring scalability and maintainability.

- Analyse existing ETL architecture and propose improvements for efficiency, consistency, and performance.

- Develop new data pipelines while maintaining the integrity of existing workflows and data models.

- Refactor and modernize legacy ETL processes to fit evolving business needs.

- Maintain & improve automation scripts in a Linux environment.

- Ensure high code quality by following best practices in version control (Git) and deployment strategies.

- Collaborate with stakeholders to define business requirements and translate them into technical solutions.

- Work within an Agile environment, contributing to sprint planning, retrospectives, and backlog refinement.
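To give a flavour of the day-to-day work described above, the sketch below shows a minimal Pandas-plus-SQL ETL step (extract, vectorized transform, idempotent load). The table and column names ("orders", "amount") are illustrative assumptions, not the employer's actual schema:

```python
import sqlite3

import pandas as pd


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Vectorized transform: drop invalid rows and derive a column."""
    df = df[df["amount"] > 0].copy()            # filter non-positive amounts
    df["amount_with_tax"] = df["amount"] * 1.1  # vectorized, no row loops
    return df


def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    """Idempotent load: replace the target table on each run."""
    df.to_sql("orders_clean", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    # Hypothetical source data standing in for a real extract step.
    raw = pd.DataFrame({"order_id": [1, 2, 3], "amount": [100.0, -5.0, 40.0]})
    with sqlite3.connect(":memory:") as conn:
        load(transform(raw), conn)
        print(pd.read_sql("SELECT COUNT(*) AS n FROM orders_clean", conn)["n"][0])
```

Keeping transforms as pure DataFrame-in, DataFrame-out functions is one common way to make pipeline steps testable and reusable, in line with the maintainability emphasis above.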

Position Requirements:

Technical Expertise:

- 5+ years of experience in ETL development, with a strong focus on data pipeline architecture.

- Strong Python skills, particularly in Pandas for data manipulation and vectorized processing.

- 3+ years of experience in designing & maintaining relational databases (PostgreSQL or similar).

- Expertise in SQL query optimization, indexing, and schema design.

- Experience with Airflow (or similar orchestration tools) for scheduling and monitoring ETL workflows.

- Solid experience with Linux environments, including scripting and automation.

- Hands-on experience integrating multiple data formats (CSV, JSON, Parquet, YAML, etc.).

- Strong knowledge of data extraction and integration from diverse sources, including databases, SFTP, cloud storage, SharePoint, and APIs.

- Proficiency with Git, preferably GitLab, for version control and CI/CD.
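For illustration, integrating heterogeneous formats with Pandas usually reduces to normalising each source into a DataFrame before combining. A minimal sketch, assuming CSV and JSON inputs with invented placeholder contents:

```python
import io

import pandas as pd

# Hypothetical in-memory sources standing in for real files or API payloads.
csv_source = io.StringIO("id,value\n1,10\n2,20\n")
json_source = io.StringIO('[{"id": 3, "value": 30}]')

# Normalise each format into a DataFrame with a shared column set,
# then concatenate into one unified frame.
frames = [
    pd.read_csv(csv_source),
    pd.read_json(json_source),
]
combined = pd.concat(frames, ignore_index=True)
```

The same shape extends to Parquet (`pd.read_parquet`) and other formats: one reader per source, one common schema, one concatenation.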

Architectural & Problem-Solving Skills:

- Ability to evaluate existing ETL pipelines and propose enhancements for performance and scalability.

- Strong problem-solving mindset: able to analyse a business problem, understand the existing data infrastructure, and design the best technical approach.

- Experience with modular pipeline design to ensure reusable and efficient ETL components.

- Design and develop ETL/ELT pipelines with an emphasis on incremental data loading to optimize performance and reduce resource consumption.

- Expertise in data modelling (Kimball), schema design, and building dimension tables for data warehousing.
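To make the incremental-loading point above concrete, one common pattern is a high-water-mark pull: load only rows newer than the latest value already held in the target. The sketch below uses SQLite and an invented `events` table purely for illustration; it is not the employer's actual design:

```python
import sqlite3


def incremental_load(src: sqlite3.Connection, dst: sqlite3.Connection) -> int:
    """Copy only rows from src newer than dst's high-water mark; return count."""
    dst.execute(
        "CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY, ts INTEGER)"
    )
    # High-water mark: the newest timestamp already loaded (0 if table empty).
    (mark,) = dst.execute("SELECT COALESCE(MAX(ts), 0) FROM events").fetchone()
    # Pull only the delta, avoiding a full re-extract of the source.
    rows = src.execute("SELECT id, ts FROM events WHERE ts > ?", (mark,)).fetchall()
    dst.executemany("INSERT INTO events (id, ts) VALUES (?, ?)", rows)
    dst.commit()
    return len(rows)
```

Compared with full reloads, this keeps per-run cost proportional to new data, which is the resource-consumption benefit the requirement refers to. The same idea carries over to warehouse dimension loads, where the mark is typically a modification timestamp or change-data-capture offset.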

