hirist

Job Description

Position : Data Engineering Lead.

Qualification & Experience :

- Bachelor's degree or equivalent.

- 5-7 years of overall experience.

- 5+ years of strong hands-on expertise in Azure Data Factory (ADF) or equivalent and SQL.

Key Skills / Primary Technical Skills :

- Demonstrated expertise in SQL and modern data pipeline orchestration tools such as Azure Data Factory or equivalent.

- Proven experience in designing and implementing scalable data pipelines to support batch and streaming data processing.

- Strong capability in data ingestion, transformation, and optimization across diverse data sources and formats.

- Experience in integrating data from APIs (REST, SOAP, etc.) into data lakes or analytical warehouses, including handling standard authentication mechanisms such as OAuth.

- Proficiency in working with semi-structured data formats including Parquet, JSON, and Avro.

- Solid understanding of data warehousing principles, including secure data model design, query performance optimization, and analytics workload management.

- Hands-on experience with cloud-based data storage solutions such as Azure Data Lake or equivalent.

- Strong foundation in data modeling, architecture, and performance tuning following industry best practices.

- Experience in managing large-scale datasets and delivering analytical insights to support business decision-making.

Secondary Technical Skills :

- Working knowledge of DevOps practices and implementation of CI/CD pipelines for data workflows.

- Ability to design and build programs to extract data from external systems using APIs and standard authentication mechanisms.

- Familiarity with NoSQL databases (e.g., DocumentDB, GraphDB) is preferred.

- Exposure to data governance, data quality, and lineage frameworks is advantageous.

Behavioral Competencies :

- Excellent communication and collaboration skills to work effectively with cross-functional teams.

- Strong sense of accountability and ownership with a proactive approach to problem-solving.

- Ability to operate in a fast-paced, collaborative environment while maintaining high standards of quality.

Responsibilities :

- Design, develop, and maintain robust, scalable data pipelines and ETL/ELT processes across cloud and hybrid environments.

- Build efficient data ingestion and transformation workflows for structured and semi-structured datasets.

- Collaborate closely with business, analytics, and engineering teams to deliver high-quality, analytics-ready data solutions.

- Optimize data workflows for performance, security, and cost efficiency, ensuring adherence to organizational standards.

- Contribute to the architecture, standardization, and continuous improvement of the enterprise data platform.
