Posted on: 09/08/2025
We are seeking a highly skilled Senior Big Data Engineer with 6+ years of relevant experience in Big Data, ETL/Data Warehousing, GCP, and Java.
The ideal candidate will design, develop, and optimize scalable, cloud-based real-time data pipelines and REST APIs using Java frameworks.
You will play a vital role in enabling data-driven decision-making by delivering innovative and reliable data solutions in collaboration with business and technical teams.
Key Responsibilities:
- Design, build, and maintain automated, scalable, and reliable Big Data pipelines on Google Cloud Platform (GCP), leveraging tools such as BigQuery, Dataproc, Dataflow, and Pub/Sub (see the pipeline sketch after this list)
- Develop and optimize ETL workflows for batch and real-time data processing using custom Python or Scala code, or Cloud Functions
- Design and implement RESTful APIs and microservices using Java and Spring Boot frameworks, incorporating security standards such as JWT/OAuth
- Manage data storage solutions across relational and NoSQL databases, including MySQL, PostgreSQL, and MongoDB, as well as managed services such as Cloud SQL
- Collaborate with cross-functional teams including data analysts, business stakeholders, and DevOps engineers to deliver data solutions aligned with business needs
- Implement CI/CD pipelines for data engineering workflows using tools like Cloud Build and GitHub Actions
- Monitor and troubleshoot production data pipelines with Cloud Monitoring and Cloud Logging to ensure high availability and performance
- Continuously explore and prototype emerging GCP technologies and third-party tools to enhance data infrastructure and analytics capabilities
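To make the pipeline responsibility above concrete, here is a minimal sketch of a streaming Dataflow job in Java using the Apache Beam SDK, reading JSON events from Pub/Sub and appending them to BigQuery. The project, subscription, and table names are hypothetical placeholders, not details from this posting; a production job would also parse and validate each payload.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubSubToBigQuery {
    public static void main(String[] args) {
        // Runner, project, and region come from the command line, e.g.
        // --runner=DataflowRunner --project=my-project --region=us-central1
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        pipeline
            // Read raw JSON strings from a Pub/Sub subscription (hypothetical name)
            .apply("ReadEvents", PubsubIO.readStrings()
                .fromSubscription("projects/my-project/subscriptions/events-sub"))
            // Wrap each payload in a BigQuery TableRow; a real job would parse
            // and validate the JSON here instead of storing it verbatim
            .apply("ToTableRow", MapElements.into(TypeDescriptor.of(TableRow.class))
                .via((String json) -> new TableRow().set("payload", json)))
            .setCoder(TableRowJsonCoder.of())
            // Append rows to an existing BigQuery table (hypothetical name)
            .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
                .to("my-project:analytics.events")
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

        pipeline.run();
    }
}
```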
Required Skills & Qualifications:
- 6+ years of hands-on experience in Big Data engineering, ETL, and Data Warehousing
- Strong expertise with the GCP Big Data ecosystem: BigQuery, Dataproc, Dataflow, Pub/Sub, Cloud Storage, Compute Engine
- Proficiency in Java programming and experience building REST APIs with the Spring Boot framework (see the API sketch after this list)
- Solid SQL skills and experience with complex queries and data transformations
- Familiarity with container orchestration and deployment technologies such as Kubernetes (GKE) is a plus
- Experience with CI/CD processes and DevOps tools for automated build, test, and deployment pipelines
- Understanding of backend system design, database operations, security, and API authentication
- Excellent problem-solving skills with a creative and innovative approach to data challenges
- Strong collaboration and communication skills to work effectively across teams and with non-technical stakeholders
- Bachelor's degree in Computer Science, Engineering, or a related field
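As a rough illustration of the Java and Spring Boot expectations above, the sketch below shows a minimal REST controller secured as an OAuth2 resource server that validates incoming JWTs. The endpoint path, DTO, and issuer configuration are assumptions for illustration only; it presumes a standard Spring Boot application with the Spring Security OAuth2 resource-server dependency on the classpath.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.ResponseEntity;
import org.springframework.security.config.Customizer;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/datasets")
class DatasetController {

    // Hypothetical response DTO; a real service would populate this from storage
    record DatasetSummary(String id, long rowCount) {}

    @GetMapping("/{id}")
    ResponseEntity<DatasetSummary> getDataset(@PathVariable String id) {
        return ResponseEntity.ok(new DatasetSummary(id, 0L));
    }
}

@Configuration
class SecurityConfig {

    // Require a valid JWT on every request; the token issuer / JWK set URI is
    // supplied via spring.security.oauth2.resourceserver.jwt.* properties
    @Bean
    SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http.authorizeHttpRequests(auth -> auth.anyRequest().authenticated())
            .oauth2ResourceServer(oauth2 -> oauth2.jwt(Customizer.withDefaults()));
        return http.build();
    }
}
```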
Preferred Skills:
- Proficiency with Python or Scala for data pipeline development
- Experience with monitoring and logging tools like Cloud Monitoring and Cloud Logging
- Knowledge of containerization and cloud infrastructure management
- Familiarity with message-driven architectures and event ingestion frameworks (see the publishing sketch after this list)
- Agile development experience
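For the message-driven item above, here is a minimal event-publishing sketch using the official Google Cloud Pub/Sub Java client. The project and topic names are hypothetical; the synchronous get() is only for demonstration, since publish() is asynchronous by design.

```java
import com.google.cloud.pubsub.v1.Publisher;
import com.google.protobuf.ByteString;
import com.google.pubsub.v1.PubsubMessage;
import com.google.pubsub.v1.TopicName;

public class EventPublisher {
    public static void main(String[] args) throws Exception {
        // Hypothetical project and topic; replace with real identifiers
        TopicName topic = TopicName.of("my-project", "clickstream-events");
        Publisher publisher = Publisher.newBuilder(topic).build();
        try {
            PubsubMessage message = PubsubMessage.newBuilder()
                .setData(ByteString.copyFromUtf8("{\"event\":\"page_view\",\"userId\":\"42\"}"))
                .build();
            // publish() returns a future; blocking on it here keeps the demo simple
            String messageId = publisher.publish(message).get();
            System.out.println("Published message " + messageId);
        } finally {
            publisher.shutdown();
        }
    }
}
```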
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1526960