hirist

Job Description

Principal Engineer, Enterprise Data Platform (15-20 years) (Data Warehouse, Dataform, Data Modelling and Architecture) and (GCP/GBQ) & Cloud Platform (Azure/AWS)

Work Location: Bangalore Hallmark Office (IBS)- LOC_WDT_IBS


ESSENTIAL DUTIES AND RESPONSIBILITIES :

- Provide strategic direction and technical oversight for a significant greenfield initiative focused on rebuilding and modernizing our data warehouse infrastructure.

- Possess deep knowledge of data engineering principles, tools, and technologies.

- Provide direction and guidance on data architecture, solution design, and implementation strategies.

- Design and implement data pipelines, data warehouses, and data lakes using modern technology components such as Spark, Iceberg, Delta Lake, Scala, Python, etc.

- Implement and enforce data governance policies, including data security controls and privacy regulations.

- Define and maintain end-to-end data flows, data lineage, and the data catalog for various data marts.

- Act as a liaison between solution architects, BSAs, and data engineers to ensure compliance with standards for data integration and data management, and review data solutions.

- Maintain inventory and roadmap of data assets across the Data Platform

- Ensure best practices for data democratization architecture and data virtualization.

- Offer insight, guidance, prototypes, and direction on the use of the latest trends and technical capabilities for Enterprise Data Management.

- Stay updated on the latest industry trends and best practices, sharing knowledge and encouraging the team to continuously improve their skills.

Qualifications :

REQUIRED :

- Bachelor's degree or higher in Computer Science, Engineering, or a related field.

- 15+ years of experience working with Data Management, including Data Integration/ETL (Informatica, Talend, Fivetran, etc.), Data Warehouses/Lakes (Oracle, Snowflake, Databricks, Google BigQuery, Redshift), Master Data Management, Data Quality, Data Modeling, Data Analytics/BI, Data Enrichment, Security, and Governance.

- 7+ years of experience focused specifically on delivery or solution architecture for large, complex data management programs.

- Strong data modeling expertise supporting both classic and modern data lakehouse architectures.

- Strong understanding of and experience building ELT/ETL solutions.

- Experience with data quality tools such as Informatica Data Quality, Atlan, Collibra, etc.

- Demonstrable experience working with data structures coming from a variety of ERP, CRM, and other data sources.

- Experience working with at least one major cloud data platform, such as AWS, Azure, or Google Cloud.

- Experience working with at least one modern lakehouse platform, such as Databricks or Snowflake.

- Experience working with Tableau, Power BI, or other data visualization tools.

- Knowledge of advanced analytics and GenAI platforms is a plus.

- Develop entity-relationship diagrams using data modeling tools.

- Guide the technical team to tune complex solutions, monitor system performance, and provide recommendations and means for improvement.

- Prototype new technologies and implement innovative solutions to enable teams to consume and understand data faster

- Responsible for metadata management of the data domain, including data definitions, data catalog, data lineage, and documentation of data flows for critical processes and SOX compliance.

- Partner with Data Governance analyst and Business Data Stewards

- Maintain an in-depth understanding of business functions, processes, and relationships as they relate to data.

SKILLS:
- A solid understanding of data modeling, modern data architecture, master data management, data profiling and data cleansing techniques.

- Experience with REST APIs, GraphQL/OData, and event-driven architecture.

- Expertise in SQL with modern data warehousing solutions (e.g., GBQ, Fabric, Snowflake, Databricks).

- Proficiency in Python and open-source data processing tools (Spark, Parquet, dbt, Dataform, etc.).

- Familiarity with essential DevOps practices, including version control, automated CI/CD, etc.

- Ability to analyze, communicate, and solve complex problems.


