
Technical Lead - Python/PySpark

NAM Info Inc
Bangalore
7 - 10 Years
4.2 | 9+ Reviews

Posted on: 17/07/2025

Job Description

Job Title : Tech Lead


Location : Bangalore, India

Employment Type : Full-time with NAM Info


Experience : 7+ years


Notice period : Immediate or within 15 days


Work Mode : Work from Office (WFO)


Work Timings : 12:00 PM to 9:00 PM


Company address : NAM Info Pvt Ltd, 29/2B-01, 1st Floor, K.R. Road, Banashankari 2nd Stage, Bangalore 560070.


Interview Process : 3 Rounds


Mode of Interview : 1st Virtual / 2nd & 3rd F2F


Job Overview :


We are looking for an experienced and dynamic Tech Lead to join our team in Bangalore. This role calls for a highly skilled technical leader who will oversee and guide teams across a variety of data engineering projects. The ideal candidate will have expertise in cloud technologies, data warehousing, and advanced data processing tools and frameworks. You will work closely with cross-functional teams to design, develop, and deliver high-quality solutions.


Required Skills and Experience :


- 7+ years of experience in software engineering and data engineering roles.

- Strong experience with AWS cloud platforms.

- Hands-on experience in Data Warehousing and cloud-based data architectures.

- Proficient in Python, Spark, and PySpark for data processing.

- Experience working with Snowflake / Databricks and other big data tools.

- Strong command of SQL and experience working with any RDBMS (Relational Database Management System).

- Expertise in using ETL tools for data pipeline development.

- Excellent problem-solving skills and the ability to troubleshoot complex issues.


Education :


- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.


What You'll Do :

- Technical Leadership : Provide technical leadership and guidance to data engineering teams, ensuring the successful execution of projects.


- Solution Design : Work closely with cross-functional teams to design and architect high-quality, scalable data solutions on cloud platforms.


- Data Processing : Lead the development of robust data pipelines using Python, Spark, and PySpark for efficient data processing.


- Data Warehousing : Apply strong hands-on experience in Data Warehousing and cloud-based data architectures.


- Cloud Expertise : Leverage expertise with AWS cloud platforms to design and implement cloud-native data solutions.


- Big Data Tools : Work with modern big data tools and platforms such as Snowflake / Databricks.


- Database Proficiency : Utilize a strong command of SQL and experience with any RDBMS (Relational Database Management System).


- ETL Development : Drive data pipeline development using ETL tools.


- Troubleshooting : Apply excellent problem-solving skills to troubleshoot and resolve complex technical issues within data systems.

