How to Make Sense of the New PageSpeed Insights
Posted by Tom Pool on January 20, 2018
Google Algorithm Updates, SEO, Site Performance
Google Updates PageSpeed Insights – A Review of the New Features
Google announced speed as a ranking factor, albeit a minor one, almost a decade ago and it’s therefore critical for publishers to understand what Google’s view of speed looks like.
We therefore took great interest when Google recently announced an updated PageSpeed Insights tool – a free tool that allows webmasters to gauge how fast their websites load for users, in Google’s view.
According to Google: “PageSpeed Insights will use data from the Chrome User Experience Report to make better recommendations for developers and the optimization score has been tuned to be more aligned with the real-world data.”
We’ve known for some time that Google were using ‘real-world’ data so it’s interesting that this is only now being exposed to publishers.
New features on the tool include the addition of real-world performance data, from the Chrome User Experience Report, which should add more relevant insight to your page speed checks.
Two new metrics have also been added – FCP and DCL – which add another dimension to the tool’s reporting. Each metric is ranked either ‘Fast’, ‘Average’ or ‘Slow’, providing a different view of page speed.
However, the updated tool has had some mixed feedback from those in the industry, with many saying it’s lacking detail in the data to make informed decisions.
For this reason we decided to take an in-depth look at the tool to see whether the changes Google has made are indeed positive.
A review of the new features
Looking at the initial page when you go to the tool, the user interface appears unchanged.
Putting in a URL is the same, and the tool will think about it for a while as it processes your request. I decided to analyse Google’s announcement URL, to see how they optimise their own content:
So what has changed?
When the results come back, however, there are some distinct differences:
As the above screenshot shows, there are now two metrics in place:
- Speed (which is unavailable in this (and many other) examples)
- Optimisation – which shows a 68/100 score (still needs to be faster, Google!)
So, ‘Data about the real world performance of this page was unavailable.’ However, ‘PageSpeed Insights was still able to analyse this page to find potential optimisations’.
Following the recommended link takes you to an FAQ page:
From this, it appears that Google won’t provide data for URLs that it doesn’t deem to be “popular” – great for those with larger sites, but it does seem that Google is almost ‘punishing’ smaller sites by withholding data they would have access to if they were a larger brand.
If that data is really needed, Google recommend using their Lighthouse tool to test your pages manually. However, this can be quite a daunting process if you are not comfortable with the developer console, and it can also spit out a load of scary-looking colours and ‘errors’.
Looking back at PageSpeed Insights:
We have an additional piece of text at the top of this report, which gives us some information about the number of round trips and the megabytes needed to fully render the page. Google then kindly supplies some median metrics as a benchmark to ‘aim’ for.
We can see that this page is 0.9MB smaller than the average page, but takes two more round trips to load the render blocking resources that are in place.
This is more information than we had previously, and already we have some actionable insights:
- To reduce the number of round trips (more important – in this case)
- To make the page lighter (less important)
The tool then (as usual) gives you a list of optimisations that can be performed to improve the ‘optimisation’ of the page, and any optimisations that it recognises have already been carried out.
Looking at the ‘Desktop’ tab of this report:
We can see that ‘Speed Data is available for https://developers.googleblog.com’, which, while not terribly useful for the page I want to investigate, does seem rather interesting:
We can now see that the ‘speed’ area has an overall ‘Average’ rating, and this score is split into two sections – FCP and DCL. These two “mystery” metrics are provided by the Chrome User Experience Report, and indicate the median value of each metric, ranking it in either the first, middle or last third of all pages.
What are FCP & DCL?
FCP stands for First Contentful Paint, which measures the time it takes for a user to see a visual response from the page. Understandably, “Faster times are more likely to keep users engaged”. I’d certainly rather see a website ‘do something’ than stare at a blank page that is supposedly “loading” for a few seconds; seeing the browser start to ‘do stuff’ is more reassuring for users than seeing no activity at all.
DCL stands for DOM Content Loaded, which measures “when the HTML document has been loaded and parsed”. This does not mean that the whole page has loaded; rather, the main HTML of the page has been completely loaded and ‘read’ (parsed). The browser then has to finish loading stylesheets and images to fully render the page. The more render blocking resources there are, the longer this will take.
Mozilla offer a small piece of advice around speeding this up:
“Some things you could do is turn your JavaScript asynchronous and to optimize loading of stylesheets”.
Basically, the quicker each of these two events is reached, the faster the page is seen to be.
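The ‘first, middle or last third’ ranking described above can be sketched in a few lines of Python. This is an illustrative model only: Google does not publish the distribution or the cut-offs it actually uses, and the toy page medians below are invented.

```python
from bisect import bisect_right

def rank_metric(value_ms, all_page_medians):
    """Return 'Fast', 'Average' or 'Slow' depending on which third of the
    distribution of all pages' medians this value lands in."""
    ordered = sorted(all_page_medians)
    position = bisect_right(ordered, value_ms) / len(ordered)
    if position <= 1 / 3:
        return "Fast"
    if position <= 2 / 3:
        return "Average"
    return "Slow"

# Toy distribution of nine page medians, in milliseconds (invented numbers):
medians = [400, 700, 900, 1200, 1500, 1800, 2400, 3000, 4500]
print(rank_metric(800, medians))   # prints "Fast" – it lands in the first third
```

The same function applied to a median of 1500 ms against this toy distribution would land in the middle third (‘Average’), and 4000 ms in the last third (‘Slow’).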
What if I want the old data – with a simple /100 page speed score?
Fortunately, if you really want this data, there is a way to access it! One way is to use URL Profiler to analyse a URL (or group of URLs) and select the ‘Page Speed’ analysis option.
This tool plugs into the Page Speed Insights API, which still gives you that lovely ‘old’ data.
If, however, you want to build your own program, or just want to see the raw output that you get from the API, Google allow you to play with the tool in their API Explorer:
Finding the actual tool takes a bit of searching:
Go to the second version (v2) of the API:
Then all you have to do is plug in your URL,
Add the ‘strategy’
And then click on execute:
The tool will think for a few seconds, and then give you a score (out of 100) for the speed – and usability too, if you are testing with the mobile strategy:
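The same v2 call that the Explorer builds for you can also be scripted. Here is a minimal Python sketch, assuming the v2 endpoint and the `ruleGroups` part of the response are as documented at the time of writing; the example scores below are invented, and `https://example.com` is a placeholder for the page you want to test.

```python
import urllib.parse

# The v2 endpoint that the API Explorer (and tools like URL Profiler) call.
API_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v2/runPagespeed"

def build_request_url(page_url, strategy="mobile"):
    """Compose the API call for a page and a strategy ('mobile' or 'desktop')."""
    query = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{API_ENDPOINT}?{query}"

def extract_scores(response_body):
    """Pull the /100 scores out of a parsed v2 JSON response. Mobile runs
    report both SPEED and USABILITY rule groups; desktop reports SPEED only."""
    return {name: group["score"]
            for name, group in response_body.get("ruleGroups", {}).items()}

# The relevant part of a v2 response looks like this (scores invented):
sample_response = {"ruleGroups": {"SPEED": {"score": 68}, "USABILITY": {"score": 95}}}
print(extract_scores(sample_response))  # {'SPEED': 68, 'USABILITY': 95}
# To run it for real, fetch build_request_url("https://example.com") with
# urllib.request.urlopen and json.load the body before calling extract_scores.
```

This gives you the ‘old’ numeric scores in a form you can log or track over time, rather than reading them off the Explorer page by hand.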
Conclusion
Google seem to be applying more weight to the FCP and DCL metrics, and pushing them more prominently to users of PageSpeed Insights. This suggests they are focusing more on user-centric metrics, and asking us (through PageSpeed Insights) to improve these two areas.
While this is an interesting update, if a page is not deemed ‘important’ enough by Google, you get no overall speed data. A lot of people are understandably disappointed by this, but my own reaction is mixed.
If you are one of the lucky ones and have real-world data for your tested URLs, it looks to be an improved version of the tool, providing far more data than was previously available.
For the smaller sites that aren’t seen as ‘important’, the functionality is still a bit lacking. I guess we’ll just have to wait and see whether Google starts analysing more pages and providing more ‘Insight’ into the speed score of all websites.
There are however some additional insights for smaller sites, with round trip and page size data displayed; more than was previously available.
Ideally, I’d like a PageSpeed Insights that offers both the FCP & DCL metrics for more pages, as well as a score out of 100, that takes into account both of these new metrics.
If you really need further insight, using the API will give you the more ‘traditional’ results, and Lighthouse will provide you with a great level of actionable detail.
Beyond PageSpeed Insights
Site speed is still a very small Google ranking factor – one of over 200 signals – and ‘content and relevance’ are still primary. However, the impact of speed on bounce rate, conversions, social shares, backlinks and innumerable other second-order effects (second order to site speed as a ranking factor) is well documented and worthy of our full attention.
Here are some other tools worth using to help with your site speed improvement efforts:
- https://www.webpagetest.org/
- http://yslow.org/
- https://gtmetrix.com/
- https://tools.pingdom.com/
- https://varvy.com/pagespeed/
- https://www.uptrends.com/tools/website-speed-test
Hey Tom,
Great article you put together here. The Pagespeed insights tool hasn’t been terribly useful for me from a speed aspect. I just made sure my website scored a 100 on mobile and desktop and moved on to be honest. I’ve been using tools like gtmetrix and webpagetest.org to test load time and perform waterfall tests. I was wondering what your thoughts were about these tools.
Cheers,
Drago
Hey Drago, Thanks for the feedback – It’s greatly appreciated! Congratulations on achieving that elusive 100! I think that the PageSpeed Insights tool is good for delivering specific optimisation recommendations, such as which images to optimise & what other areas can be improved. GTMetrix is great for some of the more in depth & technical areas, and as you say, providing a waterfall view of the page. I also find that Google Developer Console is good for providing that data – although it can look fairly complex! WebPageTest is also pretty good for providing the waterfall view, and can really…
I hadn’t heard of speedchecklist.com. Thanks again Tom!
It doesn’t seem to work on local or dev sites anymore, so, what is the point?