hirist

Job Description

Description :


Candidates for this position should preferably be based in Bangalore and will be expected to comply with their team's hybrid work schedule.

Who We Are :

- Wayfair is on a path to be the world's largest online destination for the home.

- We are the largest tech-first platform in the home category.

- Our marketplace offers over 30 million products from 23,000 suppliers, and we served 23 million customers in 2024 alone.

- Wayfair is investing heavily in building a world-class advertising business, and the Wayfair Advertising team owns features that form the core of our onsite advertising business.

- We ensure that millions of ads served are relevant to our customers by enabling advertisers and agencies to connect with the right customers at the right time with the right products.

- We are highly motivated, collaborative, and fun-loving, with an entrepreneurial spirit and a bias for action.

- With a broad mandate to experiment and innovate, we are growing at an unprecedented rate with a seemingly endless range of new opportunities.

What You'll Do :

- Drive the design, development, and launch of new data models, data pipelines, and data products focused on Search and Recommendations.

- Help teams push the boundaries of analytical insights, create new product features using data, and power machine learning models.

- Build cross-functional relationships to understand data needs, define key metrics, and standardize their usage across the organization.

- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.

- Be a technical mentor to junior engineers.

We Are a Match Because You Have :

- Bachelor's/Master's degree in Computer Science or a related technical field, or an equivalent combination of education and experience.

- 6+ years of relevant work experience in Data Engineering with web-scale data sets.

- Expertise with big data technologies and tools such as Hadoop, Spark, Hive, Presto, and Airflow.

- Experience with cloud platforms such as GCP, using technologies like BigQuery, Dataproc, GCS, Cloud Composer, and Dataflow, or related technologies on AWS or Azure.

- Comfortable designing and implementing data warehouse architectures, OLAP technologies, and star/snowflake schemas to enable self-service tooling.

- Expertise in at least one object-oriented or scripting language (Java, Scala, Python, etc.) and SQL.

- Experience with real-time data streaming tools such as Flink, Kafka, Beam, or similar.

- Experience with designing data models for traditional relational databases or big data stores.

- Strong understanding of algorithms, data structures, data architecture, and technical designs.

- Excellent communication and presentation skills, strong business acumen, critical thinking, and the ability to work cross-functionally in collaboration with engineering and business partners.

- Experience with domain-driven design, event modeling, and event sourcing is preferred.

