It’s been a busy few months, and it shows in the lack of content here. To make up for it, I decided to cover a topic I spoke about at WordCamp Milano. So, here goes: a primer on measuring your business and answering questions with data.
I need to start with a confession here: I never liked numbers, to begin with. I remember how frustrated I would be, age 15, sitting for hours in front of a math problem, and completely unable to grasp the complexity of it. Then my brother would come and be like “But what’s the issue, this is sooooo easy!” and tears of anger would start pouring down my face.
If at that time somebody came and told me “One day you’ll be looking at huge tables filled with data and loving the hell out of it”, I’d assume they had completely lost their mind. But here I am, some years later, looking at huge tables filled with data and loving the hell out of it. Irony has its ways.
I love data. The complexity of it is intriguing. If you do data right, you need to strip your biases and preliminary ideas and look at the objective truth. Even when it’s not pretty to look at, even when it’s shattering your carefully woven hypotheses apart.
Watch the presentation
In this post, you’ll find the full overview of my talk in text form with lots of details, but if you’re more of a visual type of person, you may prefer watching the recording of the talk from WordCamp Milano:
Why measure at all?
This may not seem like a question worth asking, but I wanted to cover it anyway. The answer is not simply “because you need data to make better decisions” – it goes deeper than that.
First off, data ensures everyone is on the same page. To set KPIs (key performance indicators) for your quarterly goals, you first have to agree on what those goals are. Fickle measurement criteria just mean your team hasn’t spent enough time setting its goals. Once those goals are set, KPIs make reaching them an objective matter.
And even when you don’t reach your goal, you can see what didn’t work and data can help you see what you need to improve. Digging down through the numbers can show what step of the process didn’t work.
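To make “digging down through the numbers” concrete, here is a minimal sketch of step-by-step funnel analysis. The funnel stages and all the counts are made-up, illustrative numbers, not data from the talk:

```python
# Hypothetical funnel counts for one quarter (illustrative numbers):
# how many users reached each step.
funnel = [
    ("visited landing page", 10000),
    ("started signup", 1800),
    ("completed signup", 1500),
    ("activated trial", 600),
    ("became paying customer", 120),
]

def step_conversions(funnel):
    """Conversion rate between each pair of consecutive funnel steps."""
    rates = []
    for (prev_name, prev_count), (name, count) in zip(funnel, funnel[1:]):
        rates.append((f"{prev_name} -> {name}", count / prev_count))
    return rates

for step, rate in step_conversions(funnel):
    print(f"{step}: {rate:.0%}")

# The step with the lowest conversion is the one to investigate first.
worst = min(step_conversions(funnel), key=lambda r: r[1])
print("Biggest drop:", worst[0])
```

The point is the shape of the analysis, not the numbers: the overall result (120 paying customers) tells you little, while the per-step rates point at exactly where the process breaks down.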
Quantitative data helps you kill anecdotal evidence. Don’t get me wrong, it’s still important to use other sources of insight. But looking at the specific case of your lead developer’s girlfriend may lead you to create solutions that don’t apply to 90% of your clients.
And finally, data helps you be deliberate about the work you do. New and exciting ideas abound – it’s so easy to fall for something new and shiny and just start executing. Data helps you judge whether a new initiative will actually have an impact. You set your goals for a reason, right? So why leave them behind and get a new thing underway if it doesn’t bring you closer to the end result?
One thing that I want to clear out of the way is what types of data I find useful and why. Whenever I talk about numbers, there’s the misconception that I’m saying “Numbers are the only source of truth, leave all else!” That’s not true. We always need to combine quantitative and qualitative data to see the full picture.
Using numbers helps you do incremental testing and optimization, reaching a local maximum of sorts. It can help you improve bit by bit, but it doesn’t inform the decisions that are way out there. For that, you need to try something completely different. Here’s how that looks:
In other words, you can use quantitative results to do lots of optimizations. I’m taking a page from David Darmanin’s book and telling you testing is for confirming winning ideas, not finding new ones. To do the latter, you need customer feedback and qualitative data. As David put it during his talk at the Growth Marketing Summit, there’s an evolution of testing that goes from numbers and optimization to feedback and new solutions:
Benchmarks vs historical data
The second topic that always comes up in these discussions is benchmarking. I have a profound hatred for benchmarks. They can tell you only one of three things:
- You suck and you should completely abandon what you’re doing right now. Go to an organic cheese farm and spend the rest of your life away from Internet devices.
- You’re just like the rest – no special snowflake here.
- You are amazing and you can lie back, watch Netflix, and never touch anything on your site – it’s perfect!
Essentially, you’d only get the result, but not the why. What’s more, benchmarks are wildly inaccurate, due to the nature of averages. As Avinash Kaushik puts it: “The average conversion rate in the USA is 2% whether you’re selling elephants or iPods.”
On the other hand, historical data helps you see how your performance changes and how you can improve.
To illustrate this, let me tell you a story about running. I’m an amateur runner – nothing fancy, falling within the typical minutes/km and so on. If I look at amateur benchmarks, I’ll be happy to notice I’m close to the average – so not lagging behind. If I look at professional benchmarks, I’ll get depressed, as I can never dream of going as fast as 4 min/km. Neither of these is helpful.
So I look at my best time this season and try to beat it. Over and over again. Sometimes I manage to do it and I’m happy. Sometimes I don’t – and that’s where the interesting part comes in. The moment of asking “Why?” Was it the breakfast I had? Was it the weather? Was it that I changed my route? All of these can uncover something important and actionable, helping me improve.
Looking at your data
Measurement can help you uncover insights across the whole funnel. So let us go through it, top to bottom.
Who’s your audience?
If you already have an online presence and you’ve built your core audience, you can uncover a lot about your users through Google Analytics. The interesting part is not knowing who they are, but seeing how they interact with your content. Do Europeans convert twice as much? Are 18-24 year olds a large part of your traffic, but not a big part of your paying subscribers? Are there any issues with screen resolutions you need to look into? All this can be seen in Analytics’ Audience reports.
Here’s an example showing the geographies with highest signup conversion rate – we may want to attract more users from those countries:
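The calculation behind such a report is simple enough to sketch. All country codes and numbers below are made up for illustration; in practice you’d export these figures from Analytics:

```python
# Hypothetical per-country traffic and signups (illustrative numbers).
# The goal: rank countries by signup conversion rate, not raw sessions.
sessions = {"IT": 4200, "DE": 3100, "US": 9800, "NL": 900}
signups  = {"IT": 210,  "DE": 93,   "US": 196,  "NL": 54}

conversion = {country: signups[country] / sessions[country]
              for country in sessions}

# Sort descending: the top countries may be worth more ad spend,
# even if their raw traffic is small.
for country, rate in sorted(conversion.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {rate:.1%}")
```

Note how the smallest traffic source (NL, 900 sessions) comes out on top by conversion rate – exactly the kind of insight raw session counts would hide.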
Now, if you don’t already have your own audience, don’t fret – you can still get useful information from Facebook Audience Insights. The tool allows you to segment based on demographics, behavior, and interests and see how groups of people differ from the general Facebook population.
If that sounds too abstract, here’s an example. If I want to start selling mass-market ground coffee, I might look into the typical audience of Lavazza. The screenshot below shows that some of my stereotypical notions are confirmed – IT people do drink a lot of coffee. But some of my hypotheses are disproven. Turns out the famously coffee-loving Italians are not fans of Lavazza, and I’d be better off investing in the Romanian market. So data can help you both confirm and discard ideas – just remember to test them in the first place.
A sidenote here: if you’re targeting a B2B audience, you might want to give LinkedIn Website Demographics a try. It’s a tool that gives you additional data on your website’s audience through the prism of LinkedIn data like company size, seniority, and industry.
What channels work?
The second thing to look at is which channels bring you the right people. This is quite useful, as it informs advertising decisions and your content distribution efforts. After all, it doesn’t make sense to invest time or money if you’re not attracting the right people.
So, looking at traffic sources, we need to be aware not just of the quantity of traffic, but of its quality.
Here’s an example – the first-day traffic distribution of a recent blog post I wrote. If I were looking only at session numbers, I might conclude that StumbleUpon and Reddit are great sources. But looking at the session duration, it’s clear that those users are not interested in what I have to say. So making another distribution push there will be pointless unless I change the strategy.
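One simple way to express “quantity weighted by quality” is to multiply sessions by average session duration. The channel names match the post, but every number here is invented to illustrate the pattern:

```python
# Illustrative traffic-source data: sessions and average session
# duration in seconds. The numbers are made up, but mirror the point
# above: high-volume sources can still be low-quality.
channels = {
    "StumbleUpon": {"sessions": 1200, "avg_duration": 8},
    "Reddit":      {"sessions": 950,  "avg_duration": 11},
    "Twitter":     {"sessions": 140,  "avg_duration": 95},
    "Newsletter":  {"sessions": 90,   "avg_duration": 210},
}

def engaged_time(channel):
    """Total engaged seconds: volume weighted by engagement."""
    return channel["sessions"] * channel["avg_duration"]

ranked = sorted(channels, key=lambda name: -engaged_time(channels[name]))
for name in ranked:
    print(name, engaged_time(channels[name]), "engaged seconds")
```

Ranked this way, the small-but-engaged newsletter beats the high-volume social sources – the opposite of what a raw session count would suggest.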
What content works?
OK, we’ve got some people on the site. But are we succeeding in keeping their attention? Behavior data in Google Analytics will guide us there. What makes the most sense here is pairing a couple of metrics, as some of them are a double-edged sword, if looked at alone.
Let’s take Time on Page as an example. It might be good news, showing readers are engaged with your latest blog post. But it may equally be bad news, showing that your navigation needs work and people are completely lost in finding what they’re looking for.
So, when I look at Time on Page, I look at other metrics, as well – here’s an example with a recent blog post. Time on page is high and the low Bounce Rate means people actually go on and read more. So that piece is a real gem!
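The metric-pairing logic can be written down as a small decision rule. The thresholds here (180 seconds, 40% bounce rate) are arbitrary assumptions for the sketch, not recommended values:

```python
# Classifying a page by Time on Page AND Bounce Rate together,
# since either metric alone is a double-edged sword.
# Thresholds are arbitrary, illustrative assumptions.
def classify(time_on_page, bounce_rate):
    if time_on_page > 180 and bounce_rate < 0.4:
        return "engaging content"            # people read, then read more
    if time_on_page > 180:
        return "possible navigation problem"  # long visit, then they leave
    if bounce_rate >= 0.4:
        return "weak entry page"              # quick look, quick exit
    return "quick but sticky"                 # short visit, but they stay

print(classify(240, 0.25))
print(classify(240, 0.80))
```

The same four-minute Time on Page lands in opposite buckets depending on the bounce rate – which is exactly why the metrics have to be read together.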
It’s not all numbers
So far we’ve been looking only at quantitative data, and as I already mentioned, that’s not the be-all and end-all of measurement. We should also check what people think and what they’re having issues with. Qualitative tools like Hotjar will help you here. I find the tool instrumental when battling issues with inefficient navigation or high drop-off rates.
You can use quantitative data to pinpoint problematic areas like pages with high bounce or exit rates, underperforming steps in a checkout funnel or landing pages with poor results. Then you can use the qualitative info to form hypotheses.
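The quantitative first pass can be as simple as flagging pages whose exit rate is well above the site average, then pointing your qualitative tools at those pages. The page paths, counts, and the 1.5× threshold below are all illustrative assumptions:

```python
# Flag pages with an exit rate well above the site average as
# candidates for a closer qualitative look (recordings, polls).
# All paths and numbers are made up for illustration.
pages = {
    "/pricing":   {"exits": 480, "pageviews": 800},
    "/blog/post": {"exits": 120, "pageviews": 1000},
    "/checkout":  {"exits": 350, "pageviews": 500},
    "/about":     {"exits": 60,  "pageviews": 400},
}

exit_rate = {path: d["exits"] / d["pageviews"] for path, d in pages.items()}
site_avg = sum(exit_rate.values()) / len(exit_rate)

# Anything more than 1.5x the average exit rate gets flagged.
flagged = sorted(path for path, rate in exit_rate.items()
                 if rate > 1.5 * site_avg)
print("Investigate:", flagged)
```

The flagged pages are where a hypothesis is worth forming – the numbers say *where* people leave, and the qualitative feedback has to supply the *why*.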
So, when there’s a problematic step you want to know more about, you can actually ask your users! The coolest insights I’ve had were from people answering a Net Promoter Score survey or telling us what stopped them from converting in a specific session. I often get ideas for new blog posts out of comments left in the Hotjar survey on my blog (so when you see it, please, drop me a line!) Here are some ideas on how to use polls for additional insights.
Wrapping it all up
I can actually talk about measurement all day (I’ve had people kindly ask me to stop, too). So this is just a short intro to data and measurement. You can see how I combine all this to get a better understanding of my blog’s performance with the help of Data Studio.
Whatever you choose as your measurement framework, don’t rely solely on data-driven decisions. Sometimes the best results come from that outlier test variation you had a hunch about.
I wish you happy measurement and I’ll be thrilled to hear how you approach data!