Posted on: 22/09/2025
The Senior Data Engineer will help design and implement a data lake on Google Cloud Platform (GCP), build scalable data pipelines, and ensure seamless access to data for business intelligence and data science tools.
They will support a wide range of projects while collaborating closely with management teams and business leaders.
The ideal candidate will have a strong understanding of data engineering principles, data warehousing concepts, and the ability to document technical knowledge into clear processes and procedures.
Responsibilities:
- Utilize GCP services including BigQuery, Dataform, Cloud Functions, and Cloud Storage to optimize and manage data workflows, ensuring scalability, performance, and security (a minimal workflow sketch follows this list).
- Collaborate closely with data analytics and data science teams to ensure data is properly prepared for consumption by various systems (e.g., Domo, Looker, Databricks).
- Implement best practices for data quality, consistency, and governance across all data pipelines and systems, ensuring compliance with internal and external standards.
- Continuously monitor, test, and optimize data workflows to improve performance, cost efficiency, and reliability.
- Maintain comprehensive technical documentation of data pipelines, systems, and architecture for knowledge sharing and future development.
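As a rough illustration of the kind of workflow step this role owns (a minimal sketch, not a prescribed implementation), the snippet below uses the google-cloud-bigquery Python client to run a partition-filtered, parameterized query and write the result to a destination table. The project, dataset, and table names are hypothetical.

```python
# Minimal sketch of one BigQuery workflow step.
# All project/dataset/table names below are hypothetical.
import datetime

from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project ID

# Filtering on the partition column keeps bytes scanned (and cost) predictable.
sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `my-analytics-project.sales.orders`
    WHERE order_date = @run_date   -- partition filter limits the scan
    GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(
    destination=bigquery.TableReference.from_string(
        "my-analytics-project.sales.daily_customer_totals"
    ),
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # idempotent re-runs
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", datetime.date(2025, 9, 22)),
    ],
)

job = client.query(sql, job_config=job_config)
job.result()  # block until the job finishes, raising on failure
print(f"Wrote {job.destination}; {job.total_bytes_processed} bytes processed")
```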
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, Data Science, or a related quantitative field (e.g., Mathematics, Statistics, Engineering).
- 3+ years of experience with GCP data lake and storage services.
- GCP certifications preferred (e.g., Professional Cloud Developer, Professional Cloud Database Engineer).
- Advanced proficiency with SQL, with experience in writing complex queries, optimizing for performance, and using SQL in large-scale data processing workflows.
- Strong programming skills in Python; experience with additional languages such as Java or Scala is a plus.
- Proven ability to build scalable data pipelines, automate workflows, and integrate APIs for efficient data ingestion (see the ingestion sketch after this list).
- Proficient in Git and CI/CD practices, with experience automating testing and deployment of data systems (see the testing sketch after this list).
- Experience with Looker Enterprise, including developing and maintaining LookML models to enable self-service analytics and data exploration.
- Strong data modeling skills, with experience designing scalable, maintainable models that support analytics, reporting, and business intelligence use cases across diverse teams.
- Expertise in infrastructure automation using Terraform, with experience scripting in Python and Java to provision and deploy cloud resources efficiently.
- Strong communication and collaboration skills, with a proven ability to work cross-functionally with teams such as data science, analytics, product, and business leadership to understand and meet their data needs.
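To make the API-ingestion requirement concrete, here is a minimal sketch of one common pattern: pull JSON from a REST endpoint, stage it in Cloud Storage as newline-delimited JSON, then load it into BigQuery. The endpoint, bucket, and table names are hypothetical; only standard requests, google-cloud-storage, and google-cloud-bigquery calls are used.

```python
# Minimal API-to-warehouse ingestion sketch (hypothetical endpoint/bucket/table).
import json

import requests
from google.cloud import bigquery, storage

API_URL = "https://api.example.com/v1/events"  # hypothetical source API
BUCKET = "my-ingest-bucket"                    # hypothetical staging bucket
TABLE = "my-analytics-project.raw.events"      # hypothetical destination table

def ingest(run_date: str) -> None:
    # 1. Pull one day's records from the source API.
    resp = requests.get(API_URL, params={"date": run_date}, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    # 2. Stage as newline-delimited JSON, the format BigQuery load jobs expect.
    blob = storage.Client().bucket(BUCKET).blob(f"events/{run_date}.jsonl")
    blob.upload_from_string("\n".join(json.dumps(r) for r in records))

    # 3. Load the staged file into BigQuery, letting autodetect infer the schema.
    load_job = bigquery.Client().load_table_from_uri(
        f"gs://{BUCKET}/events/{run_date}.jsonl",
        TABLE,
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        ),
    )
    load_job.result()  # raise on failure

ingest("2025-09-22")
```

Staging through Cloud Storage rather than streaming directly keeps each run replayable: the raw file persists, so a failed load can be retried without re-calling the API.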
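And as a sketch of the "automating testing" side of the CI/CD requirement: keep pipeline transforms as pure functions so a pytest suite can verify them on every commit before deployment. The transform and its fields below are hypothetical.

```python
# Minimal sketch: a pure transform plus a pytest unit test a CI job can run.
# The function and record fields are hypothetical.

def normalize_order(raw: dict) -> dict:
    """Coerce one raw API record into the shape the warehouse table expects."""
    return {
        "order_id": str(raw["id"]),
        "amount": round(float(raw["amount"]), 2),
        "currency": raw.get("currency", "USD").upper(),
    }

def test_normalize_order_defaults_currency():
    out = normalize_order({"id": 42, "amount": "19.999"})
    assert out == {"order_id": "42", "amount": 20.0, "currency": "USD"}
```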
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1550528