Job Description :

We are seeking a highly skilled and experienced GCP Data Engineer to join Brainwork Techno solutions Pvt. Ltd. The ideal candidate will have a strong background in Google Cloud Platform (GCP), particularly BigQuery, and proficiency in Python and SQL.

Key Responsibilities :

Data Engineering Expertise :

- Design, develop, and implement robust and scalable data pipelines using Python, SQL, BigQuery, and Airflow or similar orchestration tools (a minimal sketch follows this list).


- Build and optimize data pipelines for efficient data ingestion, transformation, and loading.


- Automate data workflows and ensure data quality and reliability.
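
For illustration only, the sketch below shows the kind of pipeline this role describes: an Airflow DAG that loads files from Cloud Storage into a BigQuery staging table and then runs a SQL transformation. All project, dataset, table, and bucket names are hypothetical placeholders, and the exact operators and settings would depend on the actual stack.

```python
# Illustrative sketch only - every identifier below is a hypothetical placeholder.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="orders_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingest raw CSV files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-raw-bucket",
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staged rows into a curated table with standard SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_id, customer_id, DATE(order_ts) AS order_date, amount
                    FROM `example_project.staging.orders`
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "example_project",
                    "datasetId": "curated",
                    "tableId": "orders",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )

    load_raw >> transform
```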

Data Warehousing & Data Marts :

- Design and build data marts to support business intelligence and reporting needs.


- Implement data warehousing best practices and ensure data integrity.


- Optimize data models and schemas for performance and scalability (see the sketch after this list).
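
As a hedged sketch of the data-mart modeling described above, the snippet below uses the BigQuery Python client to create a date-partitioned, clustered reporting table. The project, dataset, table, and column names are hypothetical, and a real data mart would involve more than a single table.

```python
# Illustrative sketch only - all identifiers below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example_project")

# A date-partitioned, clustered fact table for a hypothetical sales data mart.
# Partitioning and clustering keep query scans (and cost) proportional to the data actually read.
ddl = """
CREATE TABLE IF NOT EXISTS `example_project.sales_mart.fct_orders` (
  order_id    STRING  NOT NULL,
  customer_id STRING  NOT NULL,
  order_date  DATE    NOT NULL,
  amount      NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_id
"""

client.query(ddl).result()  # .result() blocks until the DDL job completes
```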

Reporting & Analytics :

- Build various business-critical reports to provide insights to stakeholders.


- Develop and maintain data visualizations and dashboards.


- Collaborate with business stakeholders to understand reporting requirements and deliver actionable insights.

Data Governance :

- Implement and enforce data governance policies and procedures.


- Ensure data security and compliance with relevant regulations.


- Manage data quality and metadata (a small data-quality example follows this list).
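
As one hedged example of the data-quality side of this work, the sketch below runs a simple SQL assertion against a hypothetical curated table and raises if null or duplicate keys are found. A real governance setup would cover far more (access controls, lineage, metadata cataloguing), so treat this purely as an illustration.

```python
# Illustrative sketch only - table and column names are hypothetical.
from google.cloud import bigquery


def assert_no_bad_keys(client: bigquery.Client, table: str) -> None:
    """Raise if the table contains null or duplicate order_id values."""
    sql = f"""
    SELECT
      COUNTIF(order_id IS NULL) AS null_keys,
      COUNT(*) - COUNT(DISTINCT order_id) AS duplicate_keys
    FROM `{table}`
    """
    row = next(iter(client.query(sql).result()))
    if row.null_keys or row.duplicate_keys:
        raise ValueError(
            f"Data quality check failed for {table}: "
            f"{row.null_keys} null keys, {row.duplicate_keys} duplicate keys"
        )


if __name__ == "__main__":
    assert_no_bad_keys(bigquery.Client(), "example_project.curated.orders")
```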

GCP Infrastructure & Migration :

- Participate in data migration projects, ensuring smooth transitions and data integrity.


- Optimize GCP resources for cost efficiency and performance.

Collaboration & Communication :

- Collaborate with various business stakeholders to understand data requirements and deliver solutions.


- Communicate effectively with team members and stakeholders, providing clear and concise updates.


- Work in a fast-paced environment and adapt to changing priorities.

Requirements :

- Strong proficiency in BigQuery.


- Experience with Cloud Storage.


- Knowledge of Cloud Composer (Airflow) or similar orchestration tools.


- Proficiency in Python and SQL.


- Understanding of data warehousing concepts.


- Experience with ETL/ELT processes.


- Knowledge of data modeling and data quality management.


- Excellent problem-solving and analytical skills.


- Strong communication and collaboration skills.


- Ability to work independently in a remote environment.

