On the 1st of August 2018, Google launched a broad core algorithm update nicknamed the ‘Medic’ update.
Whilst there has been plenty of opinion about what the August 1st update might have been related to, very few have written about recovery and what correlates with it. We believe this approach can help with understanding causation better than subjective commentary can.
The update earned the nickname ‘Medic’ because it initially appeared to be focused on health and medical sites. Barry Schwartz (CEO of RustyBrick), who has covered search obsessively for over a decade, believed the concentration of impacted sites was in health and medical, and was the first to coin the ‘Medic’ name.
It became clear afterwards, though, that the update was industry agnostic. A correlation had been drawn, but it was premature to conclude the update was industry related.
This week we released a broad core algorithm update, as we do several times per year. Our guidance about such updates remains the same as in March, as we covered here: https://t.co/uPlEdSLHoX
— Google SearchLiaison (@searchliaison) August 1, 2018
What did Google have to say?
Google’s ‘SearchLiaison’ account on Twitter is the voice of Danny Sullivan, a former journalist and co-founder of Third Door Media (publisher of the ever-popular Search Engine Land), who was drawn out of retirement to join Google. The search community was initially excited about Sullivan’s appointment as Search Liaison, hoping he would offer better transparency. Despite previous assurances, however, quite the opposite appears to be true.
Sullivan simply pointed to advice from previous updates, saying there’s “no ‘fix’ for pages that may perform less well, other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages.”
“Build great content” has been the frustrating mantra that publishers and webmasters have heard from Google for well over a decade, a line it hides behind in place of any real transparency.
Google also said, “As with any update, some sites may note drops or gains. There’s nothing wrong with pages that may now perform less well. Instead, it’s that changes to our systems are benefiting pages that were previously under-rewarded.” After being pressed on the matter though, Sullivan finally did point to an indication of where to look for help with the quality rater guidelines.
Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That's like almost 200 pages of things to consider: https://t.co/pO3AHxFVrV
— Danny Sullivan (@dannysullivan) August 1, 2018
In a hangout with Google’s John Mueller, a user submitted a question asking how soon they could see site-wide changes after fixing content to better satisfy user intent.
Mueller answered that you could fix things to make your site more relevant, but Google won’t be looking for specific fixes that lead to recovery. There’s no fixed timeline, and it’s an ongoing process: Google will reprocess pages, reindex them and reassign the signals it has.
We’ll come back to the August 1st update shortly; first, we’ll talk about a property/real-estate website Blue Array has as a client.
Shabesh.com, a leading Iranian property portal, came to Blue Array with major issues with their technical setup. Blue Array implemented best-practice and creative technical SEO changes that would give the website a better opportunity to rank for relevant keywords.
Whilst we were up for the challenge of optimising a website in Farsi, there is no keyword data to be gleaned from any of the usual industry tools, including Google’s Keyword Tool. Instead, we had the idea of using US data from people seeking property in Iran with Farsi keywords, and this proved successful in identifying gaps.
Technical recommendations and implementations included:
Page speed – including optimisation of images and improving server response times. Benchmark speed stats were 2.3s FCP (First Contentful Paint) and 2.5s DCL (DOM Content Loaded), with a ‘Low’ optimisation score (43/100 Mobile). Once recommendations were implemented, this improved to 1.1s FCP, 1.3s DCL and a ‘Medium’ optimisation score of 60/100.
Mobile vs. Desktop parity checks – to aid with the upcoming Mobile First Index roll-out. Changes included updating menus, adding horizontal links, improving pagination to reflect the desktop setup, adding more text to the mobile version of the site and many other general usability improvements.
Pagination – this element was originally marked as NoIndex/Follow; Blue Array suggested removing the NoIndex to open up pagination to crawlers. A few instances of deeper paginated pages also had canonical elements pointing to page 1, which have since been updated to point to the correct location.
Meta titles and descriptions – including fixing issues with duplication and improving the lengths (i.e. increasing where too short and reducing where too long).
Internal linking – such as improvement of horizontal linking setup and updating breadcrumbs to point to more relevant hubs.
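The meta title and description length fixes above can be sketched as a simple audit. The character thresholds below are common industry rules of thumb, not limits published by Google, and the function names are our own illustration:

```python
# Hypothetical length audit for meta titles and descriptions.
# Thresholds are assumed industry rules of thumb, not Google-published limits.

TITLE_RANGE = (30, 60)         # assumed acceptable title length, in characters
DESCRIPTION_RANGE = (70, 155)  # assumed acceptable description length

def length_issue(text, bounds):
    """Return 'too short', 'too long', or None when in range."""
    low, high = bounds
    n = len(text.strip())
    if n < low:
        return "too short"
    if n > high:
        return "too long"
    return None

def audit(pages):
    """pages: list of dicts with 'url', 'title' and 'description' keys.

    Returns a list of (url, field, problem) tuples for pages needing work.
    """
    issues = []
    for page in pages:
        for field, bounds in (("title", TITLE_RANGE),
                              ("description", DESCRIPTION_RANGE)):
            problem = length_issue(page.get(field, ""), bounds)
            if problem:
                issues.append((page["url"], field, problem))
    return issues
```

Duplication checks would sit alongside this (grouping pages by identical title text), but the length pass alone surfaces the quick wins of lengthening thin titles and trimming truncated ones.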
As a result of these recommendations being implemented, Shabesh.com saw an almost immediate positive impact to their organic traffic (implementation was early April 2018).
Comparing April to March 2018 (after implementation of technical recommendations), Shabesh saw a 156% increase in Google organic sessions.
When comparing Google organic sessions YOY, Shabesh saw a 1,248.74% increase, from 21,005 in April 2017 to 283,303 in April 2018.
The 1st of August for Shabesh.com
On the 1st of August, Shabesh.com suffered a catastrophic drop in rankings and a subsequent 36% fall in visits.
Analysing the issue, we looked at the page types that had suffered the largest drops in traffic and rankings. Very quickly a pattern emerged: new pages that had been added as faceted navigation for square footage were suffering the biggest drops. Investigating further, many of these pages had ‘no results’ of their own and had been radially expanded to include results from nearby localities.
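That pattern-spotting step can be sketched as grouping session counts by page type and comparing the two periods. The URL-segment heuristic and the figures in the usage example are illustrative, not Shabesh.com's actual data:

```python
# Illustrative sketch: aggregate organic sessions per page type for two
# periods and compute the percentage change, to surface which page types
# dropped hardest after an update.
from collections import defaultdict
from urllib.parse import urlparse

def page_type(url):
    """Classify a URL by its first path segment (a simple heuristic)."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return segments[0] if segments else "home"

def session_change_by_type(before, after):
    """before/after: dicts mapping URL -> organic sessions for each period.

    Returns {page_type: percentage change}, negative values being drops.
    """
    totals = defaultdict(lambda: [0, 0])
    for url, sessions in before.items():
        totals[page_type(url)][0] += sessions
    for url, sessions in after.items():
        totals[page_type(url)][1] += sessions
    return {
        ptype: round(100.0 * (a - b) / b, 1) if b else 0.0
        for ptype, (b, a) in totals.items()
    }
```

Sorting the result ascending puts the hardest-hit page types first, which is how a faceted-navigation template can stand out against otherwise stable sections of the site.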
It made sense to us that in a quality focused update, irrelevant child pages or nodes on a search results page would suffer. If a page is optimised for Tehran and the nodes relate to a bordering locality instead, this would be poor relevancy for Google’s users and subject to algorithmic demotion.
We also felt that Hummingbird would do a great job of surfacing parent pages for the facets present on the page, rather than explicitly needing optimised pages.
On our advice, Shabesh.com very quickly implemented the noindex directive on these ‘no results’ faceted pages, and four weeks later the site had completely recovered.
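A minimal sketch of that fix: emit a robots noindex meta element on faceted result pages that have no local results of their own. The function name and parameters are hypothetical; only the meta tag values are standard:

```python
def robots_meta(result_count, radially_expanded):
    """Return the robots meta tag for a faceted search results page.

    Pages with no results of their own (only radially expanded results
    pulled in from nearby localities) are noindexed so they drop out of
    the index, while pages with genuine local results stay indexable.
    Links are still followed in both cases so crawl paths stay open.
    """
    if result_count == 0 or radially_expanded:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

Because the problem pages were database-driven, a single template-level condition like this covered every affected facet at once, which is part of why the recovery could happen so quickly.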
CEO Shahriar Hojabr wrote to us recently confirming ‘we are back and even stronger, I should thank you for your support during difficult times’ and ‘we are proud to have worked with Blue Array’.
In contrast to the advice of Google, there can be relatively quick and simple fixes if you can identify where relevancy may be an issue. As demonstrated, this can be as straightforward as the implementation of noindex directives. It certainly helped that we had a large scale website where patterns emerged quite quickly on database driven pages. Nonetheless, we hope sharing this information might help others.