This article was originally published on the Fullstory blog, and is republished here with permission.
Knowing what happens on your website is hard. You can have all the user data in the world but no idea what to do with it, no process to manage it, and no way to filter the signal from the noise. Where do you even start? And once you begin, where do you go? What you need is a systemized approach — and the right tools.
Traditionally, there have been two approaches you can take. The first is the quantitative approach. Quantitative analytics puts everything in terms of numbers: the number of users doing whatever — bounce rates, conversion rates, time on site … that kind of thing. Quantitative research is a squeaky-clean way to crunch numbers and look at things at a high level.
The drawback to quantitative data is that much gets lost in translation — the important nuances disappear. (What makes an analytics tool functional is a subject for another day, but no matter what, the tools you choose should serve your needs and budget.) And in the absence of that nuance, quantitative data can play tricks on our pattern-recognizing brains. We see things that just aren’t there — just as we see faces in objects.
The first few results for the Google Image Search for “faces in objects” do not disappoint.
To really know what happens on your website or web application, you need a way to get at that nuance. And that’s where the second approach comes in — the qualitative. Qualitative analysis focuses on the subjective qualities of your users — the nuanced actions that drive the numbers. It’s descriptive research. The qualitative approach is subjective, too. The user paused here. Raged there. They seem confused with this UI: what’s going on? Qualitative data is messy. But it is rich in insights.
The major problem with qualitative research methods is that they operate at the level of the individual. You could never qualitatively research every user who visits your site — nor would it be useful to try.
So what do you do? Use both.
The best analyses combine quantitative and qualitative research to create a flywheel of continuous product improvement. You use quantitative information to focus attention on the biggest problems (and opportunities) and then “zoom in” to the level of the user through qualitative research to round out your understanding — and solve the problem.
This one-two research method spins out insights you can turn into actions. It’s a quantitative-qualitative flywheel that can be used for continuous improvement of your website or app.
Here’s how it works …
Quantitative data alerts you to problems. Start here.
The most basic web analytics tools available are quantitative — and free. It comes as no surprise that one of the first things website owners do on launching a new site is install Google Analytics (GA).
Quantitative analytical tools like GA do a reliable job providing high-level metrics — e.g. user info, page visits, events, conversion rates, bounce rates, time on site, etc. Most importantly, they allow you to see the status of different aspects of your site and help you focus on problems or opportunities in aggregate.
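To make those aggregate metrics concrete, here is a minimal sketch of how two of them — bounce rate and conversion rate — boil down from raw session data. The session records are hypothetical stand-ins, not a real GA export or API.

```python
# Minimal sketch of the aggregate metrics a tool like GA reports.
# The session records below are hypothetical, not a real GA export.

def bounce_rate(sessions):
    """Share of sessions that viewed only one page."""
    bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
    return bounces / len(sessions)

def conversion_rate(sessions):
    """Share of sessions that completed the goal (e.g. a signup)."""
    conversions = sum(1 for s in sessions if s["converted"])
    return conversions / len(sessions)

sessions = [
    {"pages_viewed": 1, "converted": False},
    {"pages_viewed": 4, "converted": True},
    {"pages_viewed": 2, "converted": False},
    {"pages_viewed": 1, "converted": False},
]

print(f"Bounce rate: {bounce_rate(sessions):.0%}")          # 2 of 4 sessions -> 50%
print(f"Conversion rate: {conversion_rate(sessions):.0%}")  # 1 of 4 sessions -> 25%
```

The point of numbers like these is exactly what they leave out: they tell you *that* half of visitors bounced, but nothing about *why*.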
With a functional quantitative analytics tool up and running, you’re ready to identify issues and opportunities for improvement. This is the first step in building the quantitative-qualitative flywheel. (Also see our discussion on whether user metrics make you heartless.)
The Quantitative-Qualitative Flywheel starts with identifying problems through quantitative analysis.
Once you have a quantitative tool tracking your customer data, you can quickly be dazzled by the illusion of insights. Fight the tendency to be hypnotized by the show.
Come at your newfound quantitative data powers with an objective. Otherwise, you can easily be overwhelmed.
Consider these changes in quantitative metrics that left teams struggling to understand the why behind the quantitative what:
- An increase in customer churn that Wistia couldn’t explain or reproduce. Why were customers churning?
- Decreased usage of certain features, which the customer service software company Kayako saw when analyzing feature adoption. Why weren’t customers trying out new features?
- Increased page bounces, which the translation service SpanishDict saw on its home page. What was breaking at the moment of first engagement?
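Spotting anomalies like these is where quantitative data earns its keep. As an illustration, here is a small sketch that flags pages whose bounce rate sits far above the site-wide norm — candidates for zooming in on qualitatively (say, by watching session replays). The per-page rates and the 1.5-standard-deviation threshold are assumptions for the example, not a standard.

```python
# Hypothetical per-page bounce rates; flag pages that deviate sharply
# from the site-wide average as candidates for qualitative review.
from statistics import mean, stdev

page_bounce_rates = {
    "/": 0.82,          # the home page bounces far above the rest
    "/pricing": 0.35,
    "/docs": 0.30,
    "/blog": 0.41,
    "/signup": 0.28,
}

rates = list(page_bounce_rates.values())
avg, sd = mean(rates), stdev(rates)

# Flag any page more than 1.5 standard deviations above the mean.
flagged = [page for page, r in page_bounce_rates.items() if r > avg + 1.5 * sd]
print(flagged)  # ['/']
```

The output is only the starting point: the flagged page tells you *where* to look, and qualitative research tells you *what* is actually going wrong there.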
Continue reading *Analytics Bliss: Quantitative Data with Qualitative Research*.