Posted on: 13/11/2025
Description:
Role Overview:
The Lead Data Engineer is a senior technical role requiring 4 to 10 years of total experience, with a strong specialization in database development, advanced scripting, and cloud integration.
The incumbent will be responsible for leading the design, development, and unit testing of complex data solutions using Oracle PL/SQL and Python, leveraging ETL tools like SnapLogic or Informatica, and demonstrating competency in AWS Cloud services.
This role demands a high degree of technical ownership, system analysis capabilities, and proficiency with modern DevOps and data orchestration tools.
Job Summary
We are seeking an experienced Lead Data Engineer to take ownership of end-to-end data engineering engagements. The ideal candidate will have deep, hands-on expertise in Oracle PL/SQL, Python, and SQL, paired with practical experience in cloud data environments (AWS) and ETL processes. This role requires technical leadership to drive the implementation of scalable and reliable data pipelines and enterprise data solutions.
Key Responsibilities and Deliverables
- Data Development & Analysis : Perform and lead system software analysis, development, and unit-testing for complex data pipelines, utilizing Oracle PL/SQL (minimum 4-6 years hands-on experience) and Python (minimum 2-3 years experience).
- Database Design & Optimization : Write and optimize advanced SQL queries and stored procedures, performing tuning for high-volume transactions across Oracle databases. Apply knowledge of Snowflake basics for data warehousing solutions.
- ETL/ELT Implementation : Develop and manage data integration solutions using ETL tools such as SnapLogic, or demonstrate equivalent knowledge of Informatica.
- Cloud & DevOps Integration : Utilize AWS Cloud competency (2+ years of experience) for data solutions. Integrate data workflows using CI/CD tools such as Jenkins or Bamboo, and manage code and work items using GitHub and JIRA.
- Workflow Automation : Implement and manage job scheduling for production workflows via tools like Control-M, and write UNIX scripts for file handling and job control.
- Code Quality & Standards : Ensure adherence to robust coding standards and technical best practices, utilizing IDEs such as PyCharm or Visual Studio Code (VSCode) for efficient development.
- Technical Guidance : Provide technical leadership and guidance to ensure the delivered data solutions are scalable, reliable, and meet enterprise requirements.
Mandatory Skills & Qualifications:
- Database Core : Oracle PL/SQL (4-6 years hands-on experience) and SQL (4-6 years experience).
- Scripting : Python (2+ years hands-on experience) and proficiency with UNIX scripting.
- ETL Tools : SnapLogic ETL expertise is required, or strong knowledge of and experience with Informatica.
- Cloud Competency : AWS Cloud competency (2+ years of experience) is mandatory.
- Development Tools : Experience with version control and CI/CD tools such as GitHub, JIRA, Bamboo, and Jenkins.
- Experience : A minimum of 4-6 years of hands-on system software analysis, development, and unit-testing using Oracle PL/SQL and Python.
Preferred Skills:
- Job Scheduling : Experience with advanced job scheduling via Control-M.
- DevOps : Strong understanding of DevOps principles in a data environment.
- IDEs : Proficiency using PyCharm or Visual Studio Code (VSCode).
- Data Warehousing : Knowledge of Snowflake basics.
Posted By
Soni Pandey
Senior Human Resources Manager at T and M Services Consulting Private Limited
Posted in
Data Engineering
Functional Area
Data Engineering
Job Code
1573607