Web analytics are powerful tools. But, say Digital Officer Geraint Northam and UX Officer Becca Edmeads, you’ve got to know how to use them.
Most of us use analytics unquestioningly. We don’t stop to think how accurate they might be, let alone what they might actually mean.
It’s important to recognise that analytics don’t give the full picture – they’re an indication of patterns of behaviour or trends. We need to educate ourselves on why the data might not be entirely accurate.
Common factors affecting analytics data
Some of the most common factors affecting the data include:
- People can opt out of analytics. Some visitors use ad blockers and/or don’t allow cookies, and some have JavaScript disabled. All this results in under-counting of website visitors.
- The same user may be visiting your website on different devices. If they look at your site on their mobile phone and then later on their laptop at home, they will most likely be treated as two different users.
- Web traffic may not come from real people, even if you’ve opted to exclude ‘bots’. (How much of the internet is fake?)
- Sampling is often involved, meaning numbers can never be exact. Google Analytics sometimes estimates its data based on a percentage of sessions. This article discusses sampling in more depth. You just need to be aware that it happens.
- Different analytics providers do things in different ways. For example, it’s been reported that Comscore’s numbers can be up to 50% lower than Google Analytics’. Each analytics provider will return different data for the same page.
- Contextual factors. You should also be aware of the effect that the time of year, season, month or week may have on the data you’re viewing. For example, in the higher education sector you may get a skewed impression of pageviews if you only look at course pages in the weeks or days leading up to the application deadline.
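To make the sampling point above concrete, here is a minimal sketch in Python, using entirely made-up session data, that compares a true conversion count against an estimate extrapolated from a 10% sample – roughly the kind of scaling a sampled analytics report does behind the scenes:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Hypothetical data: 100,000 sessions, roughly 3% of which convert.
sessions = [random.random() < 0.03 for _ in range(100_000)]
true_conversions = sum(sessions)

# Estimate from a 10% sample, scaled back up by a factor of 10.
sample = random.sample(sessions, k=10_000)
estimated_conversions = sum(sample) * 10

print(true_conversions, estimated_conversions)
```

Run this a few times with different seeds and the two numbers will usually be close but rarely identical – which is exactly why sampled figures are best read as estimates rather than exact counts.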
Don’t fixate on the numbers
It’s therefore critical that we place less importance on the actual numbers supplied by analytics, and focus instead on trends over time, and on using the data to gain insight into how people actually use our websites.
Analytics can show use patterns over time. The University Careers Service page, for example, shows a large peak in October and a smaller peak in February.
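As a small illustration of reading trends rather than raw figures, this Python sketch (with invented monthly pageview numbers, not real Careers Service data) smooths the series with a three-month moving average so that seasonal peaks stand out from month-to-month noise:

```python
# Invented monthly pageview figures for illustration only --
# note the autumn peak and the smaller spring one.
months = ["Aug", "Sep", "Oct", "Nov", "Dec", "Jan", "Feb", "Mar"]
pageviews = [1200, 2100, 4800, 2600, 1400, 1900, 3200, 1700]

# Three-month moving average: each value is the mean of a
# window of three consecutive months.
window = 3
smoothed = [
    sum(pageviews[i:i + window]) / window
    for i in range(len(pageviews) - window + 1)
]

# Label each smoothed value with the last month in its window.
for month, value in zip(months[window - 1:], smoothed):
    print(f"{month}: {value:.0f}")
```

The smoothed series still shows the two peaks, but small fluctuations in any single month matter much less – the same habit of mind you want when reading a real analytics dashboard.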
Tools such as heatmaps (we use Hotjar) show where users click on the page. This example from the Student Wellbeing page shows that most users want to get straight to a listing of services – ‘Support services’ at the top of the main menu and ‘Your support services’ at the bottom of the page. As a result of this insight we moved the second link to a much more prominent position on the page.
Interpreting the analytics
How you interpret the analytics depends on what your content is designed to do. Knowing the purpose of your web page or content is key to interpreting the data.
A high bounce rate can indicate a problem with a page, or it can suggest people are finding what they want and leaving. High numbers of return visits can mean people are going around in circles, or it can suggest people are purposefully re-visiting the content.
Web analytics give you a lot of information about what people are doing on your website. You’ll see broad trends and patterns of behaviour that can give you an idea of what’s working and what isn’t. You can build hypotheses to test.
Unfortunately, they won’t tell you why.
Qualitative research
To understand the ‘why’ behind your data you need to carry out qualitative research. Ask questions of the people who use your website and observe how they use it in real life.
Watch how people carry out tasks using your website and you’ll be able to see where it works for them and where it doesn’t.
We user-tested the University Student Counselling website by approaching students and asking them for a few minutes of their time to try to complete some tasks, such as booking a counselling session and finding an emergency contact number.
User testing just four or five students helped us to gain some very useful insight into the kinds of issues students might encounter when using the website. For example, we identified several cases of banner blindness, where some students weren’t seeing the ‘Emergency help’ link in the right-hand column, despite it being housed in a bright red box.
Find out when they’re using it, the environment they’re in, their motivations and their abilities and you’ll begin to understand what’s influencing users’ actions on your site.
If you don’t have the resources for large-scale surveys or in-depth testing sessions, keep it simple. Chances are you already know people who use your site. They could be your colleagues or students, professionals attending a conference or people at public events.