This guest post was written by Hannah Alvarez, marketing associate for UserTesting, a leading provider of usability testing.
Most of us have a strong preference for the way in which we get feedback. It’s just part of our personalities. We like ideas or we like numbers. We like opinions or we like hard facts.
But when it comes to optimizing a website, you can’t rely on one type of information and ignore the other. If you’re only looking at quantitative data (or only listening to qualitative feedback), then you’re only getting part of the story behind user behavior.
Let’s say you’re really fascinated with data, and you do a great job of crunching the numbers. Maybe you love to use analytics platforms, and you know exactly what users are doing on your site. You know where they’re coming from, which pages they’re visiting, where they’re converting, and where they’re leaving your site.
When you run A/B tests, once your results reach statistical significance, you can be confident about which version is performing better.
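As a minimal sketch of what “statistically significant” means here, the snippet below runs a two-proportion z-test on two versions’ conversion counts. The numbers are made up for illustration, and real analytics or testing platforms compute this for you; this just shows the arithmetic behind the verdict.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_*: number of conversions; n_*: number of visitors.
    Returns (z, p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: version A converts 200 of 5,000 visitors,
# version B converts 260 of 5,000.
z, p = two_proportion_z_test(200, 5000, 260, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value comes in under 0.05, so you’d call version B the winner; with smaller samples the same difference in rates might not be significant at all.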
But there’s something missing:
Why does one page have such a high bounce rate? Are users confused about the next step, or did they just become bored? Why did one CTA get more clicks than the other? Is it because of the color, or the copy? Or something else altogether?
On the other hand, maybe you’re deeply invested in qualitative feedback. You love talking to your customers. You pay a lot of attention to your Customer Support emails. Maybe you send out surveys and run user tests on a regular basis.
You’ve got a good picture of what’s going on in the minds of your customers.
But it can be hard to back up those opinions without numbers. You’ve probably encountered friction on your team when someone says, “Well, a handful of users didn’t understand our new headline, but our conversion rates are fine,” or “One customer sent negative feedback, but it was probably just a fluke.”
To get the complete view of what’s happening on your website, you need to combine your analytics data with your customer feedback. Here are 3 ways to use qualitative and quantitative data together to get more meaningful insights.
1. Get ideas for your next A/B test by asking your customers.
When you need to try out a new idea that will really move the needle, first find out what your customers want. You could use an on-site survey tool to ask users what they think is missing from your homepage, or why they abandoned their shopping cart.
You can also user test your site and pay attention to places where users stumble. If a user becomes confused when reading your copy, or if they say, “I don’t understand…” then you’ve just discovered a place to A/B test a new option.
2. Use data to validate your suspicions about customer preferences.
Imagine you’ve talked to a handful of customers who don’t like something on your site — for example, maybe there’s a significant usability problem that prevents mobile users from clicking your CTA button.
Take a deep dive into your analytics to compare conversion rates on that page for mobile devices vs. desktop computers. If you find that your mobile conversions are much lower, then you have the quantitative data to back up your belief that this issue needs to be fixed.
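The comparison described above can be done with a quick script over an analytics export. This is a sketch with invented visit and conversion counts, and the “less than half the desktop rate” threshold is an arbitrary choice for illustration, not a standard rule.

```python
# Hypothetical analytics export: visits and conversions by device segment.
segments = {
    "desktop": {"visits": 12000, "conversions": 540},
    "mobile":  {"visits": 9000,  "conversions": 135},
}

# Conversion rate per segment.
rates = {name: s["conversions"] / s["visits"] for name, s in segments.items()}
for name, rate in rates.items():
    print(f"{name}: {rate:.1%}")

# Flag the page for a usability fix if mobile converts at less than
# half the desktop rate (an arbitrary threshold, chosen for this example).
if rates["mobile"] < 0.5 * rates["desktop"]:
    print("Mobile conversion gap confirmed: prioritize the mobile CTA fix.")
```

If the gap shows up in the numbers, you now have quantitative backing for the qualitative complaint; if it doesn’t, the usability issue may be annoying but not costing you conversions.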
3. Allow customers to answer the “Why?” behind your data.
Sometimes, you’ll run an A/B test on two very different versions of a page, and one clear winner will emerge. For example, let’s say you A/B tested a landing page. The A version was green with a yellow button that said, “Sign up for free!” The B version was blue with an orange button that said, “Register now.”
If the B version won, was it because of the color of the page, the color of the button, or the copy in the CTA? You don’t know until you ask your users.
When you run a user test, maybe you’ll learn that users liked the color scheme from the B version, but they liked the copy in the A version. If you create a new version of your landing page that incorporates the best elements from both versions, then it will probably be more successful than either of the original versions.
Qualitative and quantitative data can be very powerful when used together. Each set can be used to complement the other, so think of them as two different perspectives that give you a bigger — and more accurate — picture of how users are behaving on your site.
Hannah is a Marketing Associate at UserTesting. As a former nonprofit professional, she’s dedicated to making the world – and the web – a better place. In her free time, she likes making things and going on adventures.