Back in May last year, we wrote about our plans for the Labs product effectiveness evaluation. Since then we’ve been busy gathering feedback and crunching numbers on Doc Ready, In Hand, Moodbug, HeadMeds and Madly in Love. We’ll be posting the evaluation report later in April but in the meantime, through a series of four blogs, we’ll give some highlights from the evaluation, including feedback received from young people.
Data Analytics
Today, we’re writing about the analytics data generated by users across the five products and what they tell us about how people engaged with each one.
All Labs product teams captured their data through widely used analytics software. Doc Ready, HeadMeds and Madly in Love used Google Analytics, and the mobile apps In Hand and Moodbug used Flurry. We analysed their usage data up to the end of October 2014, and since then some of the analytics set-ups have changed. However, the data captured gives a good snapshot of product usage in the first 6-12 months.
The Numbers Are Good
Thousands of people have visited each product and numbers continue to increase. Each team used different ways to market their product, and we can see a direct relationship between these marketing activities and visitor numbers. One good example is Madly in Love’s Spotify advertising and playlist competition in February 2014, which generated approximately 6,000 visitors. Last autumn HeadMeds ran a Google AdWords campaign and has seen large month-on-month increases in the number of people using the site – it now gets around 35,000 visits a month, approximately a third of the number received by YoungMinds’ main website. A different example is Doc Ready, whose mention on Reddit resulted in over 20,000 visits.
Do Users Engage?
The headline number of ‘visitor hits’ includes everyone who lands on the site, even for a few seconds, so on its own it does not tell us whether people actually used the products for long enough to engage with the content, for example, to compile a checklist in Doc Ready or share a mood with Moodbug. So we asked the product teams to calculate the minimum time a user would need to spend on their app or site to have a ‘meaningful interaction’.
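To make that concrete, here is a minimal sketch of what such a calculation might look like, assuming session durations have been exported to a CSV file. The file name, column name and 90-second threshold are invented for illustration and are not the thresholds the teams actually used.

```python
# Hypothetical sketch: classify exported sessions as 'engaged' or not,
# using a product-specific minimum-interaction threshold.
# CSV layout, file name and threshold are illustrative only.

import csv

ENGAGEMENT_THRESHOLD_SECONDS = 90  # e.g. long enough to build a checklist

def engagement_rate(csv_path: str) -> float:
    """Return the fraction of sessions lasting at least the threshold."""
    total = engaged = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if float(row["session_duration_seconds"]) >= ENGAGEMENT_THRESHOLD_SECONDS:
                engaged += 1
    return engaged / total if total else 0.0

print(f"{engagement_rate('doc_ready_sessions.csv'):.0%} of visits counted as engaged")
```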
For Doc Ready and Madly in Love approximately 25% of users visited long enough to register an engagement.
Moodbug (50%) and In Hand (60%) both attracted an even higher proportion of engagers.
HeadMeds generated two different kinds of engagement: 32% of visitors had what we class as a ‘light engagement’ experience, skimming the site in seconds to find a specific piece of information, while 68% registered as deep engagers, spending over a minute searching for a medication or watching a video story.
Do Users and Visitors Return?
As you might expect, people are more likely to use a mobile app more than once than they are to return to a website. For Doc Ready, HeadMeds and Madly in Love, the return rate was 16%. But we have to remember that the purpose of these products means that people may get what they need on their first visit and have no need to come back.
One of the ideas behind apps is that they meet an ongoing need and that using them becomes a habit. Overall, 89% of Moodbug users had more than one go, and 75% of In Hand users did likewise. However, the proportion of users returning consistently beyond the second go was approximately 16% across both apps.
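As a rough sketch of how this kind of repeat-use figure can be derived, assuming the analytics tool can export one row per user with a session count (the field and file names below are hypothetical):

```python
# Illustrative sketch only: measure repeat use from per-user session counts.
# Column name and file name are invented for the example.

import csv

def repeat_use(csv_path: str) -> tuple[float, float]:
    """Return (share of users with 2+ sessions, share with 3+ sessions)."""
    counts = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            counts.append(int(row["session_count"]))
    total = len(counts)
    returned_once = sum(1 for c in counts if c >= 2) / total
    kept_returning = sum(1 for c in counts if c >= 3) / total
    return returned_once, kept_returning

once, ongoing = repeat_use("moodbug_users.csv")
print(f"{once:.0%} used the app more than once; {ongoing:.0%} kept coming back")
```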
Meeting Their Purpose?
Analytics can also give us some limited insight into whether the products are delivering as intended. For HeadMeds, a key purpose is to provide information that can’t be accessed from other sources, such as ‘Can I have sex on this medication?’ and how medications interact with alcohol and other drugs. These specific information pages are among the most frequently viewed on HeadMeds. For Madly in Love, the playlists have become less popular over time, with the advice and shared experiences pages taking their place.
What Don’t We Know From Analytics?
Analytics tools are only helpful if they are set up and used in the right way. Through our data analysis we discovered that some glitches and mistakes had been made in setting up Google Analytics and Flurry on some of the products, which meant we couldn’t always retrieve the data we needed. It’s a good idea to think about what you want to know in advance, then test this out and iron out any bugs at the beginning.
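One lightweight way to do that kind of early check, sketched here on the assumption that your analytics tool can export a test run of events to CSV (the event names and file name are invented for the example), is to confirm before launch that every event you expect to track actually shows up:

```python
# Hypothetical pre-launch check: confirm every event we expect to track
# appears in a test export before relying on the data.
# Event names and export format are invented for this sketch.

import csv

EXPECTED_EVENTS = {"checklist_started", "checklist_completed", "mood_shared"}

def missing_events(csv_path: str) -> set[str]:
    """Return expected event names that never appear in the exported test data."""
    seen = set()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            seen.add(row["event_name"])
    return EXPECTED_EVENTS - seen

gaps = missing_events("test_events.csv")
if gaps:
    print("Not being tracked yet:", ", ".join(sorted(gaps)))
else:
    print("All expected events are firing.")
```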
The biggest limitation of analytics is that, however they are set up, they can’t tell you how a product is affecting a user’s mental wellbeing. To gauge any impact we need to learn directly from the people who have used the products. So, during September and October, we embedded surveys into Doc Ready and In Hand to help us find this out. Look out for my next blog, which will reveal what we discovered about young people using In Hand…