Posted on: 23/03/2026
Senior Data Engineer (7+ Years Experience) - GCP & BigQuery Specialist
Role Overview:
We are seeking a highly experienced Senior Data Engineer with 7+ years of expertise in designing, building, and managing scalable data solutions. The ideal candidate will have extensive hands-on experience with Google Cloud Platform (GCP) services, particularly BigQuery, and a strong background in data pipelines, ETL/ELT processes, and data architecture. This role requires a strategic thinker who can lead complex data engineering projects, mentor junior team members, and collaborate with cross-functional teams to deliver high-quality data solutions.
Key Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines and ETL/ELT workflows using GCP services.
- Architect and implement data warehouse solutions using BigQuery, ensuring optimal performance and cost efficiency.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
- Optimize and manage large-scale datasets, ensuring data quality, integrity, and security.
- Develop and enforce best practices for data governance, data modeling, and data lifecycle management.
- Implement streaming data solutions using tools like Dataflow, Pub/Sub, and Apache Beam.
- Monitor and troubleshoot data pipelines, ensuring high availability and reliability.
- Lead the migration of on-premises data systems to GCP, ensuring seamless integration and minimal downtime.
- Mentor and guide junior engineers, fostering a culture of continuous learning and innovation.
- Stay updated with the latest advancements in GCP and data engineering technologies, and recommend improvements to existing systems.
Required Skills and Qualifications:
- 7+ years of experience in data engineering, with a focus on cloud-based solutions.
- Expertise in GCP services, including but not limited to:
1. BigQuery
2. Dataflow
3. Pub/Sub
4. Cloud Storage
5. Cloud Composer (Airflow)
6. Cloud Functions
- Strong proficiency in SQL and experience with BigQuery SQL for complex queries and performance optimization.
- Hands-on experience with Python or Java for building data pipelines and automation.
- Deep understanding of data modeling, data warehousing, and schema design.
- Experience with streaming data processing and tools like Apache Beam or Kafka.
- Familiarity with CI/CD pipelines and version control systems like Git.
- Strong knowledge of data security and compliance standards (e.g., GDPR, HIPAA).
- Proven experience in migrating on-premises data systems to GCP.
- Excellent problem-solving skills and the ability to work in a fast-paced, dynamic environment.
- Strong communication and collaboration skills, with the ability to work effectively with both technical and non-technical stakeholders.
Preferred Qualifications:
- GCP certifications such as Professional Data Engineer or Professional Cloud Architect.
- Experience with machine learning workflows and integration with tools like Vertex AI.
- Familiarity with other cloud platforms (AWS, Azure) and hybrid cloud environments.
- Knowledge of Infrastructure as Code (IaC) tools such as Terraform for managing GCP resources.
- Experience with data visualization tools like Looker, Tableau, or Power BI.
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1622654