hirist

Job Description

Job Title : Data Architect GCP.

Location : Chennai/Hyderabad.

Experience : 10-15 years overall | 5+ years in GCP Architecture.

Budget : Open to discuss.

Notice Period : Immediate joiners, or candidates serving notice with less than 60 days remaining.

Responsibilities :

- Understand customers' overall data platforms, business and IT priorities, and success measures to design data solutions that drive business value.

- Apply technical knowledge to architect solutions and create data platforms and roadmaps on GCP that meet business and IT needs.

- Estimate and outline the solutions needed to implement cloud-native architectures and to migrate from on-prem systems.

- Execute the hands-on implementation of proposed solutions and provide technical guidance to teams during the solution development and deployment phases.

- Ensure the long-term technical viability and optimization of production deployments, and lead/advise on migration and modernization using GCP-native services.

- Work with prospective and existing customers to implement POCs/MVPs and guide them through deployment, operationalization, and troubleshooting.

- Identify and build technical collateral or technical assets for client consumption.

- Identify, communicate, and mitigate the assumptions, issues, and risks that occur throughout the project lifecycle.

- Judge and strike a balance between what is strategically logical and what can be accomplished realistically.

- Assess and validate non-functional attributes and build solutions that exhibit high levels of performance, security, scalability, maintainability, and reliability.

Qualifications :

- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

- Proven experience as a Data Engineer with a focus on GCP data services.

- Strong proficiency in GCP data engineering and data warehousing services, including but not limited to BigQuery, Dataflow, Dataproc, Data Fusion, and Cloud Storage.

- Strong proficiency in ETL processes, SQL, and data integration techniques.

- Strong proficiency in data modelling and data warehousing concepts.

- Proficiency in cloud architectural best practices around user management, data privacy, data security, performance, and other non-functional requirements.

- Programming skills in languages such as Python (including PySpark), Java, or Scala.

- Familiarity with building AI/ML models on GCP cloud solutions.

- Strong problem-solving and troubleshooting skills.

- Excellent communication and teamwork skills.

Preferred Skills :

- GCP Solution Architect and/or Data Engineer certifications.

- Experience in the BFSI, Healthcare, or Retail domains.

- Experience with data governance principles, data privacy, and security.

- Experience with big data and distributed technologies such as Hadoop, Hive, and Kafka.

- Experience with data visualization tools like Power BI or Tableau.
