On June 2, Google made an unprecedented move: The company announced a broad core algorithm update a day in advance of the actual rollout.
Over the next several months, you’ll undoubtedly see many post-mortem analyses of the algorithm update, with ideas on what might be needed to recover. Having helped many sites recover from Google’s core algorithm updates, I’ll share the four most common issues we see when clients seek our help with an algorithmic penalty.
Since Google’s “Fred” update on March 7, 2017, one of the key things we’ve observed with core algorithm updates is that they seem to be an incremental tightening of criteria related to these four key areas. Every algorithm update seems to tighten the noose a little more, eventually bringing down sites that performed well for years.
Low-Quality Or ‘Thin’ Content
Most popular content management systems, such as WordPress, offer handy navigational features that can generate a large number of pages with very little text on them. Since Google’s earliest Panda update, however, the search algorithms have appeared to favor pages with more substantial word counts. We’ve found that sites with a higher percentage of low-word-count pages sometimes struggle in the rankings. Greater length doesn’t always correlate with better content, but it’s hard for great content to come in under 200 words.
When a client comes to us, we immediately scan their site for pages with a low word count, or “thin” content. On many occasions, deleting these problematic pages can result in significant ranking increases.
With WordPress sites, installing a simple word-count plug-in is the easiest way to find problematic pages. For all other sites, website crawlers and site auditing tools can surface these pages instantly. Additionally, a “site:domain.com” search in Google helps uncover stray doorway pages or image attachment pages that Google may have indexed in error.
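If you prefer to run the check yourself, here is a minimal Python sketch of the idea, assuming you already have a list of URLs (for example, exported from your sitemap or a crawler); the URLs and the 200-word threshold below are placeholders, not recommendations from Google.

```python
# Minimal sketch: flag "thin" pages by word count.
# Assumes you already have a URL list; the threshold is illustrative.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/sample-post/",
]
THRESHOLD = 200  # words; tune to your own content standards


def visible_word_count(html: str) -> int:
    soup = BeautifulSoup(html, "html.parser")
    # Drop scripts, styles and navigation so boilerplate doesn't inflate the count.
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())


for url in URLS:
    try:
        resp = requests.get(url, timeout=10)
        count = visible_word_count(resp.text)
        if count < THRESHOLD:
            print(f"THIN ({count} words): {url}")
    except requests.RequestException as exc:
        print(f"ERROR fetching {url}: {exc}")
```

A dedicated crawler or plug-in will give you the same data without a script; the point is simply to rank pages by word count and review the thinnest ones manually before deleting anything.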
Bloated Site Design
Website users demand speed, and Google’s algorithm updates are increasingly challenging sites whose oversized images slow page loads and degrade the user experience. If a competing site delivers a faster, better experience, the bloated site will likely struggle consistently for search engine placement. To remedy this, trim your pages so that each one loads in under five seconds.
Site speed is rarely a stand-alone issue. As reported by Search Engine Land, aggressive advertisements have also been cited by Google representatives as a potential negative ranking factor. Removing ads and shrinking image file sizes is sometimes all it takes to get load times down, improve the user experience and increase site traffic (typically following the next algorithm update).
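For a quick back-of-the-envelope check before reaching for a full auditing tool, a short script along these lines can total a page’s image payload and time a simple fetch. The URL is a placeholder, and this ignores rendering, caching and CDNs, so treat it as a rough first pass rather than a substitute for a proper speed audit such as PageSpeed Insights.

```python
# Rough sketch: sum image payload and time a basic fetch for one page.
# Numbers are approximations only; rendering and caching are not measured.
import time
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/"  # placeholder URL

start = time.perf_counter()
resp = requests.get(PAGE, timeout=10)
elapsed = time.perf_counter() - start

soup = BeautifulSoup(resp.text, "html.parser")
image_bytes = 0
for img in soup.find_all("img", src=True):
    src = urljoin(PAGE, img["src"])
    try:
        image_bytes += len(requests.get(src, timeout=10).content)
    except requests.RequestException:
        pass  # skip images that fail to download

print(f"HTML fetch time: {elapsed:.2f}s")
print(f"Total image payload: {image_bytes / 1024:.0f} KB")
```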
Overly Aggressive Keyword Usage
Keyword usage is one of the first things we check after an algorithm update. We’ve consistently noticed that overoptimized sites, those that use an unusually high number of keywords, tend to move lower in search rankings with successive algorithm updates, while more conservatively optimized sites tend to move upward.
We’re not fans of being overly focused on keywords. However, keywords can demonstrate the relevance of your site to algorithms.
Accordingly, assessing problematic pages to determine their keyword density has proven to be an excellent way to recover from algorithmic penalties. By comparing the keyword density of your pages with that of the top three pages on Google, you can get a fair idea of how many primary keywords you should be using on each page.
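To illustrate the comparison, here is a small Python sketch that computes a crude keyword density (occurrences of the phrase divided by total words) for your page and a few top-ranking pages. The URLs and the target phrase are placeholders, and this is only one rough way to approximate density.

```python
# Sketch: compare crude keyword density across your page and competitors.
# Density = phrase occurrences / total word count; URLs and phrase are placeholders.
import requests
from bs4 import BeautifulSoup

KEYWORD = "core algorithm update"
PAGES = {
    "yours": "https://www.example.com/your-page/",
    "competitor-1": "https://www.example.org/top-result/",
    "competitor-2": "https://www.example.net/another-result/",
}


def keyword_density(url: str, phrase: str) -> float:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    text = soup.get_text(separator=" ").lower()
    words = text.split()
    if not words:
        return 0.0
    return text.count(phrase.lower()) / len(words)


for label, url in PAGES.items():
    print(f"{label}: {keyword_density(url, KEYWORD):.4%}")
```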
Low-Quality Backlinks
While you may never have engaged in link building, your site may still have acquired many backlinks. Backlinks were Google’s original ranking signal, and they still play a significant role. If your site has a large number of low-quality backlinks from “questionable neighborhoods,” it may not fare as well after core algorithm updates. While we’re not submitting as many disavow files as we used to (by uploading a list of harmful backlinks via Google’s disavow links tool), we still see the impact of poor-quality backlinks on websites.
Using Google Search Console, as well as third-party backlink checking tools, you can see which sites are linking to yours. Examine these sites for relevance: how related are they to your site? Also look for any indication that they may be violating Google’s Webmaster Guidelines. Sites that are not ranking well in Google (or are not indexed at all) should be approached with caution, as this may indicate that Google has a quality problem with them. Ideally, your backlinks will come from related sites with evidence of good traffic (e.g., sites that get a lot of social shares).
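One way to work through a backlink export is a quick relevance pass like the sketch below. It assumes you’ve downloaded a CSV of linking pages from Search Console or a third-party tool; the file name, column name and topic terms are placeholders, and a flagged page still deserves a manual look before you draw any conclusions.

```python
# Sketch: a rough relevance pass over a backlink export.
# Assumes a CSV with a "linking_page" column (file and column names are placeholders);
# flags linking pages that mention none of your topic terms.
import csv

import requests
from bs4 import BeautifulSoup

TOPIC_TERMS = {"seo", "search", "marketing"}  # terms relevant to your site

with open("backlinks_export.csv", newline="", encoding="utf-8") as fh:
    for row in csv.DictReader(fh):
        url = row["linking_page"]
        try:
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            text = soup.get_text(separator=" ").lower()
        except requests.RequestException:
            print(f"UNREACHABLE: {url}")
            continue
        if not any(term in text for term in TOPIC_TERMS):
            print(f"REVIEW (no topic terms found): {url}")
```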
Getting high-quality backlinks is the best defense against low-quality ones, better even than trying to disavow bad links. Seek partnerships with related websites and blogs, and find ways to collaborate on useful content for their readers in exchange for greater visibility for your site.
While we’re seeing many high-authority websites rewarded after Google’s algorithm updates, the algorithms continue to grow less forgiving of simple mistakes. Sites can recover from algorithmic penalties by optimizing their pages and building more domain authority, in other words, a better online reputation. An eye for detail and a system for monthly improvements are the keys to staying ahead of search algorithm updates.