hirist

Technical Lead - Python/ETL

PibyThree
Multiple Locations
3 - 5 Years
Rating: 4.6 (32+ Reviews)

Posted on: 14/10/2025

Job Description


Key Responsibilities :

- Lead the design, development, and optimization of end-to-end data pipelines and ETL workflows.

- Architect data integration solutions leveraging Snowflake, Informatica PowerCenter, and Teradata.

- Collaborate with business and technical teams to gather requirements and translate them into technical designs.

- Drive performance tuning, error handling, and automation across ETL and data warehouse layers.

- Provide technical leadership to developers, mentor junior engineers, and enforce best practices.

- Ensure data quality, consistency, and security across data environments.

- Support migration and modernization initiatives from legacy systems to cloud-based data platforms.

- Participate in Agile/Scrum ceremonies, sprint planning, and code reviews.

Required Skills & Experience :

Core Expertise (Must-Have) :

- Snowflake (3+ years) : Hands-on experience with SnowSQL, Time Travel, cloning, query profiling and optimization, and secure Data Sharing. Able to design schemas and implement performance-tuned solutions.

- Informatica PowerCenter (3+ years) : Proficient with Designer, Workflow Manager/Monitor, mappings and transformations, session/workflow tuning, error handling, and deployments.

- Teradata (3+ years) : Strong SQL skills including BTEQ scripting, stored procedures, performance-tuned joins, indexing/collect stats, and utilities such as FastLoad, MultiLoad, and TPT.

- SQL & Data Modeling : Demonstrated ability to design normalized and dimensional models (3NF, Star, Snowflake) and write complex, performance-oriented SQL for analytics.

Required Supporting Skills :

- Shell scripting (Bash/Ksh) : Practical experience automating ETL jobs, log handling, and job orchestration (typically 2+ years).

- Python for data engineering : Proficient in writing production-quality Python scripts for ETL and data processing (typically 2+ years).


- Familiar with commonly used libraries (e.g., pandas, SQLAlchemy), exception handling, basic unit testing, and performance considerations.
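As an illustration of the kind of script this role calls for, here is a minimal sketch of an extract-transform-load step using pandas with exception handling and logging. All table and column names (orders_curated, order_id, amount) are hypothetical examples chosen for this sketch, not details from the posting; SQLite stands in for the warehouse.

```python
# Minimal ETL sketch; hypothetical names, SQLite stands in for the warehouse.
import logging
import sqlite3

import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def extract(rows):
    """Extract: build a DataFrame from raw source rows."""
    return pd.DataFrame(rows, columns=["order_id", "amount"])


def transform(df):
    """Transform: drop records missing an amount and derive a tax column."""
    clean = df.dropna(subset=["amount"]).copy()
    clean["tax"] = clean["amount"] * 0.1
    return clean


def load(df, conn):
    """Load: write the frame to a target table, replacing prior runs."""
    df.to_sql("orders_curated", conn, if_exists="replace", index=False)
    return len(df)


def run_pipeline(rows, conn):
    """Run extract -> transform -> load with basic error handling."""
    try:
        loaded = load(transform(extract(rows)), conn)
        log.info("loaded %d rows", loaded)
        return loaded
    except Exception:
        log.exception("pipeline failed")
        raise


conn = sqlite3.connect(":memory:")
n = run_pipeline([(1, 100.0), (2, None), (3, 50.0)], conn)
```

In a production setting the in-memory connection and literal rows would be replaced by real source and warehouse connections, and each function would be covered by unit tests.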

Preferred / Nice-to-Have :

- Experience with CI/CD and version control : Git (including branching strategies) and building/maintaining pipelines with Jenkins, GitLab CI, or Azure DevOps.

- Familiarity with cloud data platforms and migrations (AWS/GCP/Azure) and cloud-native storage/compute services.

- Experience with orchestration tools (Apache Airflow, Control-M) and monitoring/alerting solutions.

- Prior experience working in Agile/Scrum teams, performing code reviews, and mentoring team members.

Skills : ETL, PowerCenter, BTEQ, Teradata, Snowflake

