Crafted for Performance

Finding the balance between conversion optimisation and maintaining brand integrity.

[Hero image: the R.M.Williams logo over a black-and-white photo of a man crafting a belt]

Background

R.M.Williams came to our team with a clear goal in mind: help them learn about their customers. Find out what they like, how they prefer to shop and what guides their decisions.

Conversion improvement would come naturally as we use data to guide our design—but our purpose was primarily to learn.

  1. The Research

We immediately set about getting to know their customers. To do this, we performed a deep quantitative and qualitative analysis of the website to understand what customers are doing when shopping online and why.

The Quant

Our data analysis was comprehensive. We uncovered high- and low-performing journeys, different behaviours by device type, how new visitors shop differently to returning ones and much, much more.
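
To give a flavour of the kind of segmentation involved, here's a minimal sketch. The file and column names are hypothetical stand-ins for an analytics export, not the real R.M.Williams data:

```python
# Hypothetical sketch: conversion rate segmented by device type and
# new vs. returning visitors. "sessions.csv" and its columns are
# illustrative stand-ins for an analytics export.
import pandas as pd

sessions = pd.read_csv("sessions.csv")  # one row per session

segments = (
    sessions.groupby(["device", "is_returning"])
    .agg(sessions=("converted", "size"), conversions=("converted", "sum"))
)
segments["conversion_rate"] = segments["conversions"] / segments["sessions"]

print(segments.sort_values("conversion_rate", ascending=False))
```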

The Qual

Talking to real people! We ran both moderated and unmoderated testing to uncover usability issues, understand what influenced customers' decisions and really get to know who R.M.'s customers are.

I also methodically evaluated the site against a set of conversion criteria (a heuristic evaluation) to see what else might be causing friction on the site.

The last step was to look across the ecommerce landscape—competitors and other retailers alike—to understand where R.M.Williams deviated from the norm, and to see if anyone had already found novel solutions to the problems we'd identified.

The Result

At the end of our research we knew how people behaved, why they made the decisions they did and what was preventing them from buying. We also had hundreds of data-validated experiment ideas ready to go.

After presenting our findings, we worked with the team at R.M.Williams to prioritise a list of experiments.
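
For anyone curious how a prioritisation like this can work, here's a minimal sketch using PIE-style scoring (Potential, Importance, Ease)—one common approach in conversion work, though not necessarily the exact framework we used. The ideas and scores below are made up for illustration:

```python
# Hypothetical PIE-style prioritisation: score each idea 1-10 on
# Potential, Importance and Ease, then rank by the average.
ideas = [
    {"name": "Expose boot sizes on the page", "potential": 8, "importance": 9, "ease": 7},
    {"name": "Skip the Shopping Bag page",    "potential": 6, "importance": 8, "ease": 9},
    {"name": "Add video to product pages",    "potential": 5, "importance": 6, "ease": 4},
]

for idea in ideas:
    idea["pie"] = (idea["potential"] + idea["importance"] + idea["ease"]) / 3

# Highest average score first
for idea in sorted(ideas, key=lambda i: i["pie"], reverse=True):
    print(f"{idea['pie']:.1f}  {idea['name']}")
```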

  2. Experiments and Outcomes

For the next two years we designed and executed experiments that helped guide the design direction of the R.M.Williams website. We helped improve navigation, filtering, size guides, size selection and the checkout process.

Our tests also informed where they should (or shouldn't) invest their money as we explored the impact of:

  • "Shop the look" tools

  • Video content

  • Buy-now-pay-later methods

  • In-store stock levels

As with every experimentation program, we had plenty of winning tests and plenty of losers. Some of the losing tests were among my favourites as they challenged our assumptions and taught us more about their customers than a winning test ever could.

While I can't share everything, below are some select examples with interesting results.

Example 1: Size selection simplified

We hypothesised that the existing size selector was adding friction for customers in two ways:

  1. They couldn’t quickly see if their size was in stock

  2. It was adding an additional step when adding an item to their cart

My solution was a simple format test: bringing the boot sizes out of the dropdown and onto the page, while also making it abundantly clear which sizes were in stock:

[Screenshots: the current experience beside the new design]

This small change had a huge impact on the way people behaved—it more than doubled the number of customers selecting their size! We also saw a significant increase in the number of people adding to their cart.

A simple test with a big outcome and an important piece of knowledge gained: having sizes visible on the page adds a little nudge to get customers thinking about the next step.

Example 2: Checkout optimisation—or not!

This is one of my favourite tests we did on the site. Again, a very simple test with a big outcome—only this one lost!

We set out to remove what we thought was a redundant step in the checkout process:

Current Journey
  1. After adding an item to the cart, customers are shown the "minicart" containing all their items. They click "View bag and checkout".

  2. Next is the Shopping Bag page, where again they see all their items. They scroll down and click "Checkout".

  3. From here they can complete their purchase.

Step 2 seemed unnecessary—they'd just seen all their items in the minicart, so why show them again? Why not go straight to the checkout?

After looking around at a number of other retailers, we noticed a common pattern: the main button in Step 1 would go straight to the checkout, and a small text link was provided for users who want that extra step in the Shopping Bag page.

Seemed straightforward enough. Let's test it!

New Journey

[Screenshot: the new journey, where the minicart's main button leads straight to the checkout, with a small text link for the Shopping Bag page]

The results genuinely surprised us. Nearly a 7% decrease in the number of completed purchases.

Can you imagine if a change like this made it onto the site without being tested? A 7% decrease in sales and nobody would have noticed.
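
For context, here's how you might sanity-check whether a drop like that is real rather than noise, using a two-proportion z-test. The numbers below are illustrative, not our actual traffic figures:

```python
# Two-sided two-proportion z-test with illustrative numbers
# (a ~7% relative drop in conversion; not the real R.M.Williams data).
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, p_value) for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control converts at 4.0%; the variant at 3.72% (~7% relative drop).
z, p = two_proportion_z(conv_a=4000, n_a=100000, conv_b=3720, n_b=100000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p well below 0.05: unlikely to be noise
```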

So what happened? Well, the data doesn't tell us why—but we can make a pretty good guess. We believe the drop occurred because R.M.Williams customers are cautious buyers who use the Shopping Bag page to double (and triple) check that everything is right. Spending $700 on a pair of boots isn't something you do flippantly.

Removing that step removed a sense of safety. That feeling of uncertainty was enough to cause customers to put off the decision until later and abandon their purchase.

Conclusion

Over two years I worked closely with the team at R.M.Williams to iteratively improve their site and learn about their customers. They trusted me to deliver designs that maintained their premium aesthetic and brand integrity.

Looking back over the 30 or so tests we ran, I can confidently say we learned an enormous amount and helped shape the direction of their website. The data from our experiments helped prevent costly mistakes and guided them towards a best-in-class ecommerce experience.
