hirist

Job Description

We are looking for an experienced ETL Lead Developer to design, build, and manage scalable data integration solutions that support business intelligence, analytics, and enterprise applications. The ideal candidate will have strong expertise in ETL tools, data warehousing concepts, and cloud/on-premise data platforms. As a lead, you will guide a team of developers, collaborate with business stakeholders, and ensure the delivery of high-quality data pipelines that drive data-driven decision-making.


Key Responsibilities:

- Lead the design, development, and implementation of ETL workflows, data pipelines, and integration solutions.
- Work closely with business analysts, data architects, and stakeholders to gather requirements and translate them into technical specifications.
- Oversee and mentor a team of ETL developers, ensuring adherence to coding standards, best practices, and project timelines.
- Optimize ETL processes for scalability, performance, and reliability.
- Manage data integration across multiple systems, databases, and platforms (cloud/on-premise).
- Perform data profiling, validation, cleansing, and transformation to ensure data quality.
- Collaborate with QA teams on unit testing, system testing, and user acceptance testing (UAT).
- Troubleshoot and resolve complex ETL and data integration issues.
- Implement monitoring, logging, and error-handling frameworks for production ETL processes.
- Ensure compliance with data governance, security, and regulatory standards.
- Participate in Agile ceremonies (sprint planning, stand-ups, retrospectives) and contribute to continuous improvement initiatives.


Required Technical Skills:

- Strong experience with ETL tools such as Informatica, Talend, DataStage, SSIS, or equivalent.
- Solid understanding of data warehousing concepts (star schema, snowflake schema, slowly changing dimensions, fact/dimension tables).
- Proficiency in SQL, PL/SQL, and performance tuning of complex queries.
- Hands-on experience with relational databases (Oracle, SQL Server, PostgreSQL, MySQL, etc.) and data lakes.
- Exposure to cloud platforms (AWS, Azure, GCP) and their data services (e.g., Redshift, Snowflake, BigQuery, Azure Synapse).
- Knowledge of scripting languages (Python, Shell, PowerShell) for automation.
- Familiarity with data modeling techniques and ER diagrams.
- Experience with version control systems (Git) and CI/CD pipelines for ETL deployments.
- Understanding of data governance, lineage, and metadata management.


Good to Have (Optional Skills):

- Experience with big data technologies (Hadoop, Spark, Kafka).
- Knowledge of API integration (REST, SOAP).
- Hands-on experience with containerization (Docker, Kubernetes).
- Familiarity with reporting/BI tools (Tableau, Power BI, Qlik).
