Dan Tase
tasedanmarian@gmail.com
+44 7449 984 331
dantase.com
Just Eat
Help me choose
For new users, choosing what to order is one of the hardest things. Customers felt like they were dropped into an endless list of restaurants, each with a huge variety of dishes, and without enough information to decide.

This project turned into a long series of A/B tests, looking to identify small wins (that could easily be delivered across all platforms) and long-term gains (larger features that needed to be tackled by independent product teams).

Problem: Users have a hard time deciding what to order and where to order from.
Desk research & focus groups
The first step was identifying pain points in the existing flow and trying to understand more about how our users order. We started with desk research, going through articles and university studies to learn more about the paradox of choice, menu psychology and related topics.

Later on, we organised focus groups with takeaway customers to walk through the general ordering process and run some factor-prioritisation exercises.


1. Do our users have a problem choosing what to order?
First, we needed to validate our assumption. Are users really feeling stuck when using Just Eat?

2. What would be the most effective way of helping them?
If that's the case, what ways could we improve the ordering experience?
Kick-off session
After the research phase we had validated the existing problems, and we had some hints at various ways of solving them.

We set up a kick-off session to gather different ideas from around the company. In the end we were left with tons of ideas, which we prioritised based on the problems they solve. Some were quick fixes, some were quite radical.
Celebrate choice
There's an assumption that too much choice is bad. Reducing choice by hiding a number of restaurants isn't ideal from the B2B side, as it risks upsetting restaurant owners. Instead, we decided to bundle restaurants into curated, contextual lists: time of day, new restaurants, events, etc. This makes it easier to choose, with only 10-15 restaurants to compare in a single list.
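To make the idea concrete, here is a minimal sketch of the bundling approach: group restaurants into small, themed lists and cap each list, rather than showing one long feed. The list names, fields and the 30-day "new" cut-off are illustrative assumptions, not the production logic.

```python
# Hypothetical sketch of curated, contextual restaurant lists.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Restaurant:
    name: str
    cuisines: list[str]
    days_since_listed: int

def contextual_lists(
    restaurants: list[Restaurant],
    now: datetime,
    max_items: int = 15,
) -> dict[str, list[Restaurant]]:
    """Group restaurants into small, themed lists instead of one endless feed."""
    meal = "Lunch picks" if now.hour < 16 else "Dinner picks"
    lists = {
        "New on Just Eat": [r for r in restaurants if r.days_since_listed <= 30],
        meal: [r for r in restaurants if "pizza" in r.cuisines or "sushi" in r.cuisines],
    }
    # Cap every list so users only ever compare 10-15 options at a time.
    return {title: items[:max_items] for title, items in lists.items()}
```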

This was shipped as an A/B test in some London postcodes. While results were positive (conversion went up and average time on page was shorter), this was quite hard to scale worldwide.
Upfront delivery time
As we found out during research, delivery time is the most important factor when ordering takeaway. Unlike Deliveroo or Uber Eats, Just Eat doesn't handle delivery itself, and it's impossible to provide 200,000 external drivers with smartphones and unlimited data plans.

For those reasons, we created an algorithm that estimates the delivery time based on parameters like distance, traffic, time of day, number of items ordered and how busy the restaurant is.
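As a rough illustration of how those factors could combine, here is a minimal sketch of such an estimate. The linear form, weights and peak hours are illustrative assumptions, not the model that shipped.

```python
# Hypothetical sketch of a delivery-time estimate built from the factors above.
def estimate_delivery_minutes(
    distance_km: float,
    traffic_factor: float,          # 1.0 = free flowing, 2.0 = heavy traffic
    hour_of_day: int,
    items_in_order: int,
    open_orders_at_restaurant: int,
) -> int:
    prep_time = 10 + 2 * items_in_order + 3 * open_orders_at_restaurant
    travel_time = distance_km * 4 * traffic_factor      # ~4 min per km baseline
    peak_penalty = 10 if hour_of_day in (12, 13, 19, 20) else 0
    return round(prep_time + travel_time + peak_penalty)

print(estimate_delivery_minutes(3.2, 1.4, 19, 4, 6))    # ~64 minutes
```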

This started as an A/B test in Spain, and is currently being rolled out worldwide.
Social validation
Most users look for recommendations from their friends in every aspect of their lives, from going on holiday to deciding what to wear. We tried to integrate this social validation into the core Just Eat experience.

This was designed and validated, but it's still on hold due to technical reasons.
Dish ratings
Once they reach the Menu page, it's extremely hard for users to decide which dish to buy. Dish images would be great, but it's impossible to achieve that for more than 80,000 restaurants (roughly 2.4 million photos) across 9 different countries.

Although we did quite a bit of photography A/B testing in the past, the results weren't decisive. My definition of a good-looking dish is most probably different from someone else's.

Instead, a simpler iteration was to highlight popular dishes that have good reviews. This was shipped as an A/B test on the Italian marketplace, and the results showed an uplift in conversion and a lower average time on page.
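A minimal sketch of that selection logic is below: pick the most-ordered dishes that also clear a review bar. The fields, thresholds and top-3 cut-off are illustrative assumptions.

```python
# Hypothetical sketch: surface "popular" dishes that also have good reviews.
from dataclasses import dataclass

@dataclass
class Dish:
    name: str
    orders_last_30d: int
    avg_rating: float       # 1-5 scale
    rating_count: int

def popular_dishes(
    menu: list[Dish],
    min_rating: float = 4.0,
    min_reviews: int = 10,
    top_n: int = 3,
) -> list[Dish]:
    """Pick the most-ordered dishes that clear a minimum review bar."""
    eligible = [d for d in menu if d.avg_rating >= min_rating and d.rating_count >= min_reviews]
    return sorted(eligible, key=lambda d: d.orders_last_30d, reverse=True)[:top_n]
```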
Improving reviews
The reviews section is another important area of the platform, heavily influencing the user's final decision. In the initial focus groups we noticed that Just Eat users don't really trust our reviews and prefer going to Google or Trip Advisor to get a better understanding of the restaurant.

To increase the trust level we made it easier for users to leave reviews, allowed restaurants to reply, and created 'Quality' tags based on the rating. All of these can later be surfaced on the listing and menu pages.
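As an illustration, a 'Quality' tag can be derived from the average rating along these lines. The tag names, thresholds and minimum review count are illustrative assumptions, not the production rules.

```python
# Hypothetical sketch: derive a 'Quality' tag from a restaurant's average rating.
def quality_tag(avg_rating: float, review_count: int) -> str | None:
    if review_count < 20:       # not enough data to make a claim
        return None
    if avg_rating >= 4.5:
        return "Excellent"
    if avg_rating >= 4.0:
        return "Very good"
    return None                 # no tag rather than a negative label
```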

This started as an A/B test, but has since been rolled out worldwide.
Other quick wins
Besides all the big, high-level features, we also shipped small, incremental changes to key pages in the ordering flow. As designers, we tend to focus on the big picture but forget to look at the small things that might be impactful.

Price Label:
Price is another key factor when deciding where to order from. We added a price label to make it easier for users to understand how expensive a restaurant is.
Dish size:
Lots of users complained that they never know the size of a dish. Showing portion sizes is extremely helpful when ordering for multiple people: why order two portions of rice when one is enough for two?
Kudos to the Just Eat team ❤️
Thanks to Simon Poole (Tech Manager), Erika Tamayo (PM), Erdeniz Hassan (UX Researcher) and all the great developers & data folks.
If you'd like to hear more about this project, or to talk about how I can help you improve your product through a test & learn approach, reach out at tasedanmarian@gmail.com