In September 2019, I took to the main stage at the always amazing BrightonSEO conference to share experiences, tips and suggestions for optimising forum content – a topic that I feel is little-discussed in SEO right now, despite offering amazing opportunities to bring traffic into a website.
Here’s a roundup of some of the main points from my presentation, including some tips I had to cut out due to time constraints!
- How can having a forum benefit SEO?
- Common SEO issues with forums
- Content duplication on forums
- Large forums = crawling issues
- The role of trust in forum SEO
- Other considerations
How can having a forum benefit SEO?

Forums are, potentially, a fast way to generate content that’s relevant to your target audience. Whether you’re adding a forum to a product website or creating a standalone community on a particular theme, you’ll be more likely to rank for longer-tail queries and semantic variations on head keywords.
User-generated content also adapts to, and draws on, topical or trending terms faster than editorial content. This hugely increases your reach, and therefore your opportunity to draw in organic visits.
Other benefits of creating a forum
- Keep your target market on your website for longer by giving them masses of content to consume
- Open up a new dialogue channel with your customers in which to support them
- Increase brand awareness by attracting visitors not initially intending to visit your website
- Provide an avenue for market testing
- Create an owned channel to market future products and services.
At Blue Array, we’ve worked with a range of forums – some serving as the main part of the website with static content alongside, others forming a small part of the on-site experience. Generally, they’ve tended to sit at the more enterprise end of the scale, with many thousands or even millions of indexed pages.
Common SEO issues with forums

Many forums have been hit hard by significant algorithmic drops over the past year, namely the so-called ‘Medic’ update in August 2018 and the March 2019 core update, which appeared to largely impact websites dealing with YMYL (your-money-or-your-life) topics.
Working with our clients, we delved into potential reasons why their forums might not be performing as well as they’d like, and the resulting issues tend to fall into three main areas: duplication, crawling and trust.
Content duplication on forums

Forum users are notorious for creating duplicate threads targeting similar topics. Thread titles, and therefore meta titles, can also be incredibly vague, giving search engines (and searchers) very little to go on in terms of context or relevance.
Whilst this isn’t necessarily the fault of your everyday user, this can cause issues from an SEO point of view, such as keyword cannibalisation, poor ranking and a lack of user engagement.
Here are some ideas for tackling content duplication:
- Impose a minimum character limit for thread titles. Do not allow the user to click ‘submit post’ until this limit has been reached.
- Bring in the forum section name and the name of the website to add a little more contextual information.
- Use autoprompts if you have the functionality: suggest existing relevant threads to the user that they might like to contribute to instead of creating a new one.
- Assimilate part of the post body copy into the meta title in order to increase the character count and provide more useful information to search engines.
- Show users thread title guidance on the ‘create post’ screen – remind them that it’s in their own interest to create a descriptive, engaging post in order to get the visibility and level of discussion they’re looking for.
- Make it easier for users to find existing, relevant threads using an internal search feature – and name your forum sections accurately to show users where to find existing information on what they’re interested in.
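A few of the tips above – the minimum title length, bringing in the section and site name, and assimilating a snippet of body copy into the meta title – could be sketched roughly like this. The length thresholds and the helper names are illustrative, not taken from any particular forum platform:

```javascript
// Sketch: validating a thread title and assembling a meta title from it.
// The 10-character minimum and the 30/60-character targets below are
// illustrative values, not rules from the post.
const MIN_TITLE_LENGTH = 10;

function isTitleValid(title) {
  // Keep the 'submit post' button disabled until this returns true.
  return title.trim().length >= MIN_TITLE_LENGTH;
}

function buildMetaTitle(threadTitle, sectionName, siteName, postBody) {
  let base = threadTitle.trim();
  // If the title alone is short and vague, assimilate a snippet of the
  // post body copy to give search engines more to go on.
  if (base.length < 30 && postBody) {
    base = `${base} – ${postBody.trim().slice(0, 60 - base.length)}`;
  }
  // Add the forum section and site name for extra context.
  return `${base} | ${sectionName} | ${siteName}`;
}
```

In practice you would wire `isTitleValid` into the ‘create post’ form and run `buildMetaTitle` server-side when rendering the page head.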
Large forums = crawling issues

As with any exceptionally large website, we’ve seen forums with a number of issues relating to crawling (and therefore indexing). What’s different about forums is that they have the potential to grow exponentially – ironically, the better you do at getting people to post and engage with content, the more difficult it becomes to manage your crawl budget.
Here’s how to tackle crawl budget issues on forums:
- Use separate sitemaps for each section, and store them within a sitemap index – this will help you identify and isolate any particular areas that are struggling, and help you debug them faster.
- In a similar vein, completing some in-depth log file analysis is recommended, again looking at areas of the site winning and losing when it comes to regular crawling – we suggest taking a minimum of two weeks’ worth of data. Check out this article by our very own Technical SEO Executive Vicky Mills on log file analysis to find out more.
- Consider your internal linking – are pages being crawled less because it would take 800 steps for a crawler to get there? Are we linking to our most valuable pages clearly and regularly?
- You can increase the crawl rate of more deeply situated pages by integrating modules in your page wireframes – this could include ‘recent’, ‘popular’ or ‘trending’ posts.
- HTML sitemaps are an underused method of directing crawlers to deeper pages – examples we’ve seen include linking out to all categories and subsections, while others have actually selected most popular threads, or those which are likely to bring the most value/opportunity in terms of targeted keywords, in order to boost their profile.
- Don’t over-paginate. Ensure that your rules allow for 20 to 30 replies per page before generating a new page in the series.
- Deal with content cruft! When dealing with a large amount of content and deciding what could be removed, it’s helpful to set benchmarks in order to do this efficiently:
- Freshness – when was the original post made live? When was the last reply?
- Engagement – how popular is the thread, is engagement steady over time or has it dropped off?
- Relevance – if threads relate to time-bound events, is it worth keeping them when the content is out of date? Examples: Queen’s Diamond Jubilee television coverage, new Amy Winehouse album release.
- Search volume – conduct some keyword research and then measure your existing content up against the potential opportunity.
- Value – make sure that pages with no SEO benefit are not crawlable, including internal search results pages, sporadic contributor profile pages, parameterised URLs and so on.
One example we found: a very out-of-date discussion of a Doctor Who episode, dated 23rd April 2005 – still crawlable!
- When it comes to retiring pages, there are a few options:
- 410 them – if they have no value and are really old or spammy, consider getting rid of them permanently
- Block them from being crawled via robots.txt – this will be easier if you have consistent URL structures!
- Put archived or older content behind a login wall to prevent crawlers from accessing it
*NOTE: It’s imperative that you check pages you’d like to remove for any valuable incoming link equity; always act with caution!
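As a rough illustration of the robots.txt approach – assuming a consistent URL structure in which retired threads live under an archive path, and internal search and parameterised pages are identifiable by pattern (all of these paths are hypothetical):

```
# Hypothetical rules – adjust the paths to your own URL structure
User-agent: *
Disallow: /forum/archive/
Disallow: /forum/search
Disallow: /*?sort=
```

Note that wildcard patterns like `/*?sort=` are supported by Googlebot but not by every crawler, so test your rules before relying on them.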
The role of trust in forum SEO

Although Google doesn’t particularly like to talk about the rather subjective concept of ‘trust’, as SEOs we know that it’s crucial to success that our website contains quality, authoritative content in order to rank competitively. Letting the general public define the content on your website, while efficient, is also inherently risky from a trust perspective.
Based on algorithmic changes, and our own direct experience, we believe search engines look at three main areas when considering a site’s trustworthiness:
- the link graph – who we are connected to and how
- content quality – in-depth, well-written articles containing verifiable information
- site hygiene – how many issues crawlers encounter when visiting our pages.
We also know that letting a website’s trustworthiness slide can have severe consequences for a business’s bottom line. Here’s a snippet from Google really driving that home:
Low-quality content on some parts of a website can impact the whole site’s rankings.
Spam can distract and annoy your users and lower the reputation of your site.
Unintended traffic from unrelated content on your site can slow down your site and raise bandwidth costs.
Google might remove or demote pages overrun with user-generated spam to protect the quality of our search results.
Source: Google Webmasters
Here’s how you can improve trust signals on a forum:
- Use moderators – whether on your payroll or just trusted contributors, they can make a huge difference to the overall quality and trust of a forum by:
- closing or merging duplicate threads
- banning trolls and spammers
- bringing threads back on-topic
- reminding people of the forum rules.
- Blanket rel=nofollow all outbound links within user-generated content – far too often, a forum will be brought down by repeated spam comments.
- Review your backlink profile regularly and disavow harmful links through Google Search Console.
- Use referencing and citations throughout factual content to make it more reputable – healthline.com illustrates a good approach to this, linking out to studies and sources throughout its articles.
- Find your expert users and make more of them:
- Use them as moderators to help you fight poor quality content.
- If they are experienced in (or professionally tied to) a particular relevant industry, make their profile pages thorough and in-depth. We really feel that Google, in particular, is moving towards a more entities-based approach where signals relating to trust are built up not in isolation on a single domain, but as a whole structure across the internet. Examine.com do this really well, linking through to Wikipedia references for the author as well as .gov or .edu sites showcasing their published work.
- Encourage participation using gamification! Lots of good forums do this, offering benefits to those who regularly engage with content, and offer quality replies or posts. Moz do this really well with their MozPoints system, and getting people to think about their online reputation can genuinely benefit the site.
- You could also consider ringfencing certain sections (most likely your highest risk/value areas for YMYL type queries) so that only expert users can contribute, safeguarding quality in this area.
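The blanket rel=nofollow advice above can be sketched as a simple post-processing step on user-generated HTML. The domain name is a placeholder, and a production forum would normally do this in its templating or sanitisation layer rather than with a regex – this is purely illustrative:

```javascript
// Sketch: blanket-adding rel="nofollow" to outbound links in
// user-generated HTML. 'ourforum.example' is a placeholder for
// your own domain.
const INTERNAL_HOST = 'ourforum.example';

function nofollowOutboundLinks(html) {
  return html.replace(/<a\s+([^>]*?)href="([^"]+)"([^>]*)>/gi, (match, pre, href, post) => {
    // Only absolute links pointing away from our own host are outbound.
    const isOutbound = /^https?:\/\//i.test(href) && !href.includes(INTERNAL_HOST);
    // Leave internal links, relative links and tags that already
    // declare a rel attribute untouched.
    if (!isOutbound || /rel=/i.test(match)) return match;
    return `<a ${pre}href="${href}"${post} rel="nofollow">`;
  });
}
```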
Other considerations

Check which devices your users favour when accessing the site – as we move into a mobile-first world, we’ve found that ensuring a positive mobile experience for both crawlers and users is essential to forum success. This means:
- Checking mobile site page speed
- Ensuring there’s navigational or internal linking parity between desktop and mobile versions of the site (if not responsive)
- Avoiding intrusive interstitials.
Of course, some forums like StackOverflow will be more heavily accessed via desktop, so it really is important to keep your audience in mind when tweaking and optimising.
Alongside ‘DiscussionForumPosting’ schema, a new type of structured data called ‘Q&A’ was released in 2018, giving forum owners further opportunity to claim some of that all-important real estate in SERPs, with additional rich results visibility. Make sure to use this only on threads where the thread author is asking a question – for example, discussions like ‘Luther Series 5 UK pace’ or ‘Growing roses in the autumn’ aren’t suitable.
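As a rough sketch, Q&A markup on a suitable (genuinely question-led) thread might look like the following JSON-LD. The thread, authors, URLs and vote counts are all invented for illustration – check Google’s structured data documentation for the current required and recommended properties:

```json
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "Is it too late to plant bare-root roses in October?",
    "text": "I've been given some bare-root roses – is October too late to get them in the ground?",
    "answerCount": 2,
    "upvoteCount": 5,
    "dateCreated": "2019-09-13T09:30:00+01:00",
    "author": { "@type": "Person", "name": "RoseGrower88" },
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "No – autumn is actually a good time to plant bare-root roses.",
      "upvoteCount": 8,
      "dateCreated": "2019-09-13T11:02:00+01:00",
      "url": "https://example.com/forum/roses/thread-123#answer-1",
      "author": { "@type": "Person", "name": "GardenPro" }
    }
  }
}
```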
Forums are the original home of conversational acronyms. Here’s a sample of some of the most common:
- IIRC – If I remember correctly
- AFAIK – As far as I know
- IMHO – In my humble opinion.
If your community uses industry-specific acronyms, you’ll definitely want to investigate a way to automatically expand them to the full text to aid keyword targeting. We’ve worked with a client where this was possible as the user types – preventing extra labour on the user’s part while aiding context and clarity for search visibility.
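A minimal sketch of that expansion step, using the acronyms listed above (a real implementation would hook into the post editor’s input events rather than run as a standalone function):

```javascript
// Sketch: expanding common forum acronyms to their full text so the
// stored post carries the complete phrase for keyword targeting.
const ACRONYMS = {
  IIRC: 'if I remember correctly',
  AFAIK: 'as far as I know',
  IMHO: 'in my humble opinion',
};

// Word boundaries stop us touching acronyms embedded in other words.
const ACRONYM_PATTERN = new RegExp(`\\b(${Object.keys(ACRONYMS).join('|')})\\b`, 'g');

function expandAcronyms(text) {
  return text.replace(ACRONYM_PATTERN, (match) => ACRONYMS[match]);
}
```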
Forums are a hidden goldmine of static content ideas. Make the most of repeated questions or topical discussions by using user-generated content to form the basis of some really in-depth, quality pieces of content. Do this well and you’ll be creating some brilliant backlink magnets, funnelling in both authority and referral traffic from linking domains.
As always, remember to use internal linking effectively in order to boost the ranking potential of your most valuable pages.
Measurement and analysis
The most important thing to keep in mind when designing a forum structure is to use logical, consistent URL paths. This makes analysis and management of the site infinitely easier, and in many ways, this is just common sense and good practice.
Whereas with most informational or article-focused websites you can track some user behaviour, on forums the potential level of data analysis is huge. Make the most of it by setting up event tracking for engagement (thread starts, comments, likes, shares), account maintenance, and your most-visited sections and areas.
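The event-tracking idea might be sketched like this – the event names and payload shape are purely illustrative, and you’d adapt them to whatever your analytics platform expects:

```javascript
// Sketch: mapping forum interactions to analytics event payloads.
// The action names and payload fields are illustrative assumptions.
function forumEvent(action, thread, section) {
  const allowed = ['thread_start', 'comment', 'like', 'share'];
  if (!allowed.includes(action)) {
    throw new Error(`Unknown forum action: ${action}`);
  }
  return {
    event_category: 'forum_engagement',
    event_action: action,
    event_label: `${section}/${thread}`,
  };
}

// In the browser you would then pass the payload to your analytics
// snippet, e.g. gtag('event', payload.event_action, payload);
```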
Forums can be a lot of hard work – especially if they’re popular and generating engagement all day long. They’re a labour of love, but the payoff, if they’re effectively and regularly managed and optimised, can be exceptional.
To make the most out of your forum from an SEO perspective, pay attention to specific common issues such as:
- Content duplication
- Crawl management
- Trust and quality
- Mobile experience
Attention to detail here will help you get on track for success.
Let me know if you have any questions, or if you’ve found anything works particularly well when dealing with forums!