4 important insights I’ve gained about SEO reporting
Posted by Stephanie Whatley on November 3, 2020
They say ‘if you can’t measure it, you can’t manage it’. Whilst not everything is easily quantifiable, there’s a lot to be said for understanding the metrics that matter, why they’re important and how best to draw insights and actions from them.
As an SEO specialist, having worked through several large-scale migrations and on huge sites, my love of reporting has come on a journey from ‘oh God no’ to ‘hell yeah’ over the last few years. Here are a few of the most important lessons I’ve picked up along the way.
1. Correlation ≠ Causation… but both are valid
Page one in every analyst’s handbook – correlation does not equal causation. Repeat to fade. Using correlative patterns as a basis for an SEO strategy or tactic is usually seen as asking for trouble, and for good reason. Often, leaping to conclusions without exploring the data further:
- distracts you and steals focus from more impactful issues you’ve not discovered yet
- could lead to corrective actions that further impede organic progress.
Of course we would all love to see a clear cut, irrefutable reason for traffic fluctuations – the scenario most likely to let us quickly deduce the cause is where technical changes are deployed (albeit sometimes unknowingly!). The chart below shows how the removal of hreflang from a US subfolder on a UK site battered traffic very quickly in spring 2019:
Google doesn’t let us know the specific site areas that are being targeted in any new broad core updates – and we know that, mostly, organic performance does not hinge on a single metric or ranking signal (the above example excepted). To suggest it does is hugely reductive, oversimplifying a complex, constantly changing ecosystem. Moreover, if that were the case, it would be gamed by absolutely everyone at the touch of a button.
With the above in mind, we can conclude that for the most part, analysing organic site performance should be done holistically – and there isn’t always a single, obvious reason for uplift/depression – ergo, correlation is sometimes the best insight that we have to hand.
Here’s an example. In a single week, we gain a fantastic new follow backlink to our product page. We amend the copy to mention that we’re now offering a 10% discount due to the end-of-season sale. Google tweaks its algorithm, as it does several times a day. The web devs 410 an old page that linked internally to it.
Our ranking position drops by 6 places, our traffic remains stable and our impressions rise.
Are we able to definitively: a) discover all this information within a reasonable time frame (even the best website monitoring tools, like ContentKing, will have a slight delay in finding such changes); b) attribute these outcomes to specific actions; and c) generate any actions off the back of it?
If the answer is no, you need to use correlation – with caution – and do the best you can.
Tips for dealing with a lack of definitive answer:
- It’s ok to use correlation, as long as you acknowledge that it’s the best guess based on the insight and information you have, and you handle any next steps carefully.
- Annotate! Google Analytics lets you do this easily, so you can see what has been happening when and work backwards. You can also create a log of changes in Sheets and link them to Google Data Studio reports.
- Similarly, talk to your devs, talk to your product and marketing teams, and find out what’s happening elsewhere that may have impacted performance.
- Communication is key. It’s even better if you have access to the development backlog!
- Try to deploy one thing at a time where possible in order to better isolate and explain the impact as far as you can.
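The change-log tip above can be as simple as a dated CSV or Sheet that you later connect to your reporting tool as a data source. Here’s a minimal sketch in Python – the file name, categories and entries are hypothetical, and in practice you might write straight to a Google Sheet instead:

```python
import csv
from datetime import date

LOG_PATH = "seo_change_log.csv"  # hypothetical file; a Sheet works just as well

def log_change(description, category, path=LOG_PATH, when=None):
    """Append one dated change entry (date, category, description).
    The resulting file can be linked to a Data Studio report so
    traffic movements can be read against what actually changed."""
    when = when or date.today().isoformat()
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([when, category, description])

# Example entries mirroring the scenario described above
log_change("New follow backlink to product page", "off-site")
log_change("Copy amended: 10% end-of-season discount", "content")
log_change("Old internally-linking page returned 410", "technical")
```

The point of logging the category as well as the description is that, months later, you can filter the log to just ‘technical’ changes when investigating a sudden drop.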
2. Different teams need different data to do their best
Moving onto the next shocking revelation, C-suite colleagues are probably not interested in log file analysis. It’s easy for us as SEOs to punch everything into a single report, fire it off and let it speak for itself no matter who looks at it – but in my experience, adapting reports to fit the needs of different teams helps to:
- Increase the speed and likelihood of implementation/action
- Improve relationships and retention (both in-house and agency-side)
- Highlight progress (and success) effectively to your peers and senior decision makers, which can potentially unblock resources to further benefit organic performance.
Talking to your clients or stakeholders at the outset of establishing an ongoing reporting system is crucial to finding out what they are actually interested in and what SEO information they need to do their jobs better. I’ve added some examples of how these metrics may vary by team below – but there will undoubtedly be more to add:
There are lots of ways to do this – our tool of choice at Blue Array is Google Data Studio, as it allows for easy-to-use paginated navigation with a page for each team, while still giving them the option to look at other pages if they want to.
Tips for working with teams on more bespoke SEO reporting:
- Talk to these teams! The best way to get the information to make your reports perfect for them is to find out their thoughts – pain points, current projects or issues, and what they include or look for in their own internal reports.
- Frontload and automate the work as much as possible. Effort spent upfront pays off in the long run when you can use parameters, filters and date range selectors to quickly update your reports and draw out those insights.
- Review your reports with your stakeholders regularly. Staff churn, campaigns and emerging priorities can all impact the usefulness of the report, so re-align at least quarterly to make sure it’s still fit for purpose.
3. Data blindness is real
One of the biggest challenges when you love reporting is falling down the rabbit hole, and being so overwhelmed with data and the minutiae that you can’t find a path forward to extract anything useful or actionable. While it’s great that in 2020 we have access to so many amazing sources of data, occasionally we can be completely overloaded by the volume of information we have to work with.
The reality is, most of the data we have access to is useless for helping us achieve success through SEO strategies and deliverables. It’s not about acquiring a huge amount of data, it’s about acquiring the right data. Thinking smart, instead of thinking big.
This starts with being clear about what questions you want to answer before diving in. These questions may become clear as a result of daily checks and witnessed fluctuations, news of a huge algo update, or a new round of implementation – or they may just be ‘what’s new and interesting?’.
Tips for dealing with data blindness:
- Make sure you have an idea of what you’re looking for before you start wading through the data
- BUT beware of confirmation bias. Our questions should be open, so that we don’t immediately rush to find only the data that supports our theories
- Blending data from different sources is a fast way to see impact at scale and the relationships between metrics.
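As a concrete illustration of blending, here’s a minimal sketch joining hypothetical Search Console page data (clicks, impressions) with analytics conversions on URL, then deriving the relationship metrics – CTR and conversion rate – that neither source shows on its own. The figures and URLs are invented for the example:

```python
# Hypothetical exports keyed by URL: one from Search Console, one from analytics
gsc = {"/product-a": {"clicks": 1200, "impressions": 40000},
       "/product-b": {"clicks": 300, "impressions": 25000}}
ga = {"/product-a": {"conversions": 24},
      "/product-b": {"conversions": 15}}

def blend(gsc_rows, ga_rows):
    """Join the two sources on URL and derive relationship metrics."""
    blended = {}
    for url, g in gsc_rows.items():
        conv = ga_rows.get(url, {}).get("conversions", 0)
        blended[url] = {
            **g,
            "conversions": conv,
            "ctr": round(g["clicks"] / g["impressions"], 4),
            "conv_rate": round(conv / g["clicks"], 4) if g["clicks"] else 0.0,
        }
    return blended

report = blend(gsc, ga)
# /product-b converts better (5%) despite far fewer clicks than /product-a (2%)
```

This is the kind of relationship a blended Data Studio source surfaces automatically – a page can look weak on clicks alone yet be your best converter.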
4. Some metrics are meaningless (without context)
If you’re not overly familiar with the term ‘vanity metrics’, it describes numbers that don’t really tell you anything valuable – you can’t helpfully report on them or draw out any actions or recommendations from them. Many metrics fit this bill, mainly because on their own they lack the further context needed to give them meaning and clarity.
Here are some examples of common vanity metrics:
Keyword rankings – commonly referred to by clients as their ultimate goal for SEO, ranking in specific positions does not necessarily guarantee ROI. A sizable percentage of search terms are completely unique and have no search volume precedent to factor in, nor is there any guarantee that certain terms will convert well. SERPs are sometimes hugely overcrowded with rich results and other features, so CTR for position 1 may be far lower than anticipated.
Overall bounce rate – taken at site level, this really doesn’t tell you much about engagement. Some areas of a site may have an extremely high bounce rate, especially if they’re low performers (e.g. 2 visits and 1 bounce gives a 50% bounce rate for that section; averaged in across a large site, figures like this will hugely skew your number!).
Backlinks acquired – it’s not enough to say ‘this is how many backlinks we got’. How many did we lose (a natural part of the link lifecycle)? How valuable are they in terms of the referring page/domain? Is the anchor text or context relevant and positive? Now that Google has introduced more link attributes, are they genuine follow links?
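The bounce-rate arithmetic above is worth making concrete. In this sketch (with invented per-section figures), naively averaging section bounce rates treats a 2-visit section the same as a 10,000-visit one, while weighting by sessions gives the rate users actually experienced:

```python
# Hypothetical per-section figures: (sessions, bounces)
sections = {
    "/blog/":    (2, 1),         # tiny traffic, 50% bounce rate
    "/product/": (10000, 2000),  # the bulk of the site, 20% bounce rate
}

def naive_average(data):
    """Mean of per-section rates -- every section counts equally,
    so the 2-visit section skews the result upwards."""
    rates = [bounces / sessions for sessions, bounces in data.values()]
    return sum(rates) / len(rates)

def weighted_rate(data):
    """Site-level rate from totals -- each session counts once."""
    total_sessions = sum(s for s, _ in data.values())
    total_bounces = sum(b for _, b in data.values())
    return total_bounces / total_sessions

print(naive_average(sections))  # 0.35 -- inflated by the 2-visit section
print(weighted_rate(sections))  # ~0.2001 -- what actually happened site-wide
```

Either way, the headline number still hides the section-level variation, which is why bounce rate needs segmentation before it tells you anything actionable.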
The list goes on. One way to address metrics like these is to ask the question ‘what does this tell me, and what can I do to improve it?’. If the answer isn’t obvious, it’s probably because you need to look deeper and understand the wider picture. Good reporting is about telling a story through data, and understanding how this information plays a role in helping us achieve our goals. If a metric won’t help that, and it’s not relevant for other objectives or KPIs, should we spend time reporting on it? Maybe not.
Tips for dealing with vanity metrics:
- Always ask WHY. Why is this important? Talk to your stakeholders or clients about what value they see in knowing this information.
- Experiment with user/behaviour segments in Google Analytics. There are some ready-made segments out of the box, such as converters or new/returning users. But with a rudimentary understanding of Google Tag Manager, it’s relatively straightforward to create much more sophisticated ones relevant to your goals – for example, users who took a specific action but didn’t convert, or who revisited less than 5 days after their last session. Doing this helps you look more closely at how qualified your traffic is, not just the amount of it.
SEO reporting is always going to be important, whether you enjoy it or not, as it ultimately holds us and our stakeholders accountable for realising success, and helps us home in on issues to be fixed quickly. It’s also a minefield of contradictions and black holes, so staying level headed and forging a clear path through the data will be pivotal to staying sane and getting the most out of the process.
The last – and possibly most important – insight is that SEO reporting is only valuable if there are actions off the back of it. Reporting for reporting’s sake will do absolutely nothing to move the needle, and time spent should be worthwhile in terms of ROI – whether that’s additional resources being allocated to help us achieve our targets, or more meaningful conversations to assist with better market placement and targeting. Reporting should always equal action.