Posted on: 03/09/2025
Key Responsibilities:
- Design and develop data pipelines and ETL processes to ingest, process, and store large volumes of data.
- Implement and manage big data technologies such as Kafka, Dataflow, BigQuery, CloudSQL, and Pub/Sub.
- Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions.
- Monitor and troubleshoot data pipeline issues and implement solutions to prevent future occurrences.
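As a rough illustration of the monitoring responsibility above, the sketch below shows one way a pipeline freshness check could be written in BigQuery SQL; the table `analytics.orders`, the `loaded_at` column, and the two-hour threshold are hypothetical and not part of this posting.

```sql
-- Hypothetical freshness check for a pipeline target table in BigQuery.
-- Flags the table as stale when no rows have landed in the last 2 hours.
SELECT
  'analytics.orders' AS table_name,
  MAX(loaded_at) AS latest_load,
  TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), MINUTE) AS minutes_since_load,
  TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), MINUTE) > 120 AS is_stale
FROM analytics.orders;
```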
Required Skills and Experience:
- Generally, we use Google Cloud Platform (GCP) for all software deployed at Wayfair.
- Data Storage and Processing:
  - BigQuery
  - CloudSQL
  - PostgreSQL
  - DataProc
  - Pub/Sub
- Data Modeling:
  - Breaking business requirements (KPIs) down into concrete data points
  - Building scalable data models (see the dbt-style sketch after this list)
- ETL Tools:
  - DBT
  - SQL
- Data Orchestration and ETL:
  - Dataflow
  - Cloud Composer
- Infrastructure and Deployment:
  - Kubernetes
  - Helm
- Data Access and Management:
  - Looker
  - Terraform
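To make the data-modeling and DBT/SQL items above concrete, here is a minimal dbt-style sketch that breaks one hypothetical KPI (daily on-time shipment rate) down into the data points behind it; the source name `wms`, the table `shipments`, and every column name are illustrative assumptions, not the actual Wayfair schema.

```sql
-- models/marts/fct_daily_shipment_kpis.sql (hypothetical dbt model)
-- Breaks the "on-time shipment rate" KPI down into the data points behind it.
WITH shipments AS (
    SELECT
        shipment_id,
        warehouse_id,
        DATE(shipped_at) AS ship_date,
        shipped_at <= promised_at AS is_on_time   -- data point feeding the KPI
    FROM {{ source('wms', 'shipments') }}
)

SELECT
    ship_date,
    warehouse_id,
    COUNT(*)                                   AS shipments_total,
    COUNTIF(is_on_time)                        AS shipments_on_time,
    SAFE_DIVIDE(COUNTIF(is_on_time), COUNT(*)) AS on_time_rate   -- the KPI itself
FROM shipments
GROUP BY ship_date, warehouse_id
```

A model along these lines would typically be materialized as a table and scheduled through dbt runs orchestrated from Cloud Composer.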
Ideal Business Domain Experience:
- Supply chain or warehousing experience
- The project is focused on building a normalized data layer that ingests information from multiple Warehouse Management Systems (WMS) and projects it for back-office analysis (sketched below).
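As a hedged sketch of what such a normalized layer could look like, the query below unions two hypothetical WMS feeds into one conformed shipments shape; the dataset names `wms_a` and `wms_b` and every column mapping are assumptions made up for illustration.

```sql
-- Hypothetical staging query that normalizes shipment records
-- from two different Warehouse Management Systems into one shape.
SELECT
    CAST(shipment_id AS STRING)  AS shipment_key,
    'wms_a'                      AS source_system,
    warehouse_code               AS warehouse_id,
    shipped_timestamp            AS shipped_at
FROM wms_a.shipments

UNION ALL

SELECT
    CAST(ship_ref AS STRING)     AS shipment_key,
    'wms_b'                      AS source_system,
    site_id                      AS warehouse_id,
    dispatch_time                AS shipped_at
FROM wms_b.outbound_orders;
```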
Posted in: Data Engineering
Functional Area: Data Engineering
Job Code: 1539570