Lee Cooper, VP of Professional Services at UserZoom, explains how highly targeted user experience research is the key to unlocking the most effective approach.
Any business with a customer-facing website – and these days, that increasingly means ‘any business’ – needs to think carefully about user experience on that website. User experience can be the difference between an individual bouncing straight back off that site or staying to browse – and the difference between browsing and buying. It has a direct and powerful relationship to the business bottom line.
But business leaders have known this for a long time. The interesting question is not ‘should I pay attention to online user experience?’ but ‘how can I enhance that online user experience as much as possible?’
When it comes to answering that question, there are two main areas to consider. First, there is the actual design of your website and the mechanics of how it operates – the navigation and the journeys users take. Second, there is the website content – how well does it speak to what your customers want and need? Is it appealing and easy to engage with? The first area can be optimised very effectively using quantitative data, while covering the second comprehensively benefits from a more qualitative approach.
Combining quantitative and qualitative
At this stage, many digital strategists will be nodding along, knowing that they quite rightly harness a rich variety of user data in order to tweak and tailor their website in line with genuine customer behaviours. Heat maps, information architecture testing, task-based testing – all of these are valuable research methodologies, commonly applied to both live and test websites, enabling digital professionals to see how their sites work with a real user base and make adjustments accordingly. Multivariate testing, whereby different user groups are presented with different versions of a website and their behaviours are compared to identify the most effective designs, is particularly useful, and is used to powerful effect by a wide range of organisations. Even the relatively low-level use of tools such as Google Analytics can fit into this category. All these methods help track metrics pertaining to user behaviour, and data correlation can then be used to inform the digital development strategy.
However, such testing can be enhanced even further. The tests above produce valuable quantitative data, resulting in convincing metrics that can be used to select the most effective pathway forward. But they typically don’t produce the rich qualitative data that explains why users have made particular choices – the more descriptive type of data referenced above is still missing. While qualitative methods, such as questionnaires and forms, are commonly used to provide this kind of feedback, it is much less common for the two approaches to be used in tandem.
A supercharged user experience strategy, then, combines large samples of quantitative data demonstrating what users have done with qualitative data explaining why they have done so. It’s simple enough to implement: a feedback form or comments box, filled in on exiting a multivariate test, is all it takes. Using the two testing methods in conjunction is the key to a truly rounded set of data, and developers can then delve far further into the motivations behind certain behaviours, often gaining insights into their customers’ reactions to their site that they simply couldn’t glean from numbers and metrics alone.
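As a rough sketch of the quantitative half of this approach, the comparison between two site variants in a multivariate (or simple A/B) test often reduces to a two-proportion z-test on conversion counts, with exit-form comments stored alongside each variant. The figures, variant names and comments below are invented purely for illustration – they are not drawn from any real test:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test on conversion counts
    for two site variants (A and B)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se                              # positive if B outperforms A
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided p-value
    return z, p_value

# Hypothetical numbers: variant A shows every deal,
# variant B shows only actionable deals.
z, p = two_proportion_z(conv_a=96, n_a=1200, conv_b=144, n_b=1200)
significant = p < 0.05

# The qualitative half: exit-form comments keyed by variant,
# kept alongside the metrics so the "why" travels with the "what".
exit_feedback = {
    "B": ["Much cleaner - I only saw deals I could actually switch to."],
    "A": ["Too many options; half of them didn't apply to me."],
}
```

The test verdict alone says variant B converted better; reading the paired comments is what explains the behaviour, which is exactly the rounded data set described above.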
In turn, these insights can be used to break down pre-existing assumptions regarding user behaviours and desires, which is particularly valuable when long-held assumptions or even regulations have framed the way a site is developed. It’s very easy, at the development stage of a website and beyond, to assume that all customers want X or behave like Y because of Z – and quantitative testing may not provide the deep insights necessary to challenge these preconceptions.
Redefining the approach to energy comparisons: a MoneySuperMarket case study
MoneySuperMarket discovered this to their benefit when they deployed a multivariate testing solution with the help of UserZoom.
MoneySuperMarket is accredited by Ofgem, the UK energy market regulator, as a price comparison site for energy products. At the time, Ofgem required comparison sites to show users all of the deals available – even if some of those deals weren’t applicable to the user in question. The existing assumption – the preconception in terms of site design – was that it was fairer for all customers to be presented with all the information and filter through the options independently.
With Ofgem’s agreement, MoneySuperMarket ran a multivariate test over several weeks, in which some of the website designs only presented users with actionable deals. Crucially, UserZoom was used to capture customer feedback at the point of exiting the site, providing MoneySuperMarket with a rich and detailed set of qualitative and quantitative data. The results were overwhelming – users much preferred to see only the deals they could actually proceed with, finding this cleaner, simpler and easier to manage.
The data collected and presented to Ofgem was rich and compelling, and this year the Competition and Markets Authority ultimately ruled in favour of comparison sites showing only the deals that customers can switch to. Preconceptions of what was best for website users were shattered – all thanks to a deeper, more detailed approach to collecting user experience data.
The customer is always right
This has long been a mantra for multiple aspects of business, including digital marketing and website design. But while a customer in a restaurant can easily be asked, face to face, whether everything is acceptable with their meal, these conversations have often been neglected by digital developers. However, with the right technology, this data can be captured online – and is the key to unlocking an enhanced user experience.