Google Core Update December 2020

Google Core Update December 2020 Makes Search Less Relevant

Posted on 12/07/2020

Story or Event Date:
Monday December 07, 2020
Post # 2935 posted in:
Community - News - Tech
Location:
1600 Amphitheatre Parkway, Mountain View
California, United States

The Google Core Update released on December 3, 2020 has unfortunately made search engine results pages (SERPs) less relevant and of poorer quality for some queries. We found this out today after looking at the Google performance of two sites in our network between December 4th and December 6th. For those dates we noticed a significant decline for several queries, including the following two on two different sites:

1. "Snitch List" (without quotes) on CopBlaster.com (https://copblaster.com/snitches/)

2. "Celebrities with Herpes" (without quotes) on STDCarriers.com (https://stdcarriers.com/famouspeople/celebrities-genitalherpes-1.aspx)

These queries and landing pages have several things in common: such searches are usually conducted by users looking for lists of people to whom the relevant characteristic applies; the lists on these sites are among the largest of their kind available on the internet, or close to it; the landing pages are category listings of posts on the source sites; and both sites are hosted on a server due to be upgraded for performance reasons.

"Our mission is to organise the world’s information and make it universally accessible and useful." - Goolge Inc. (https://about.google/)

Google's goal is to provide its users with the information most relevant to their search. To do this Google uses advanced algorithms in an effort to figure out what the user is looking for and to give them the results most useful to them. Unfortunately, a lot of people figured out long ago that they could trick Google into thinking their pages were relevant to what users were looking for. This has led to a constant struggle between Google and webmasters in which Google tries its best to eliminate poor quality pages from its results.

Usually Google succeeds at this goal and serves users the content most relevant to their search, but sometimes Google screws up and pushes more relevant content below less relevant content due to the other indicators it uses to judge content quality besides just what keywords are in what places, how much content exists, and how many websites link to that content. As a result you can have all the right keywords in all the right places, with rich content that lots of people link to, and still not do well on Google. It used to be that you could outrank more reputable sites by putting the right keywords in the right places if your competition was not, but semantic search has changed that. Today people are more likely to get results full of similar terms from sources Google prefers in general, even if that means serving less relevant results to users.

Other issues such as page speed, editorial quality, and the quality of unrelated content on the same site also play a role. If your site is too slow it will show up below inferior content just because Google does not want to make users wait too long for it to load. If your content has all the keywords in all the right places with a lot of text, but Google's algorithms determine that the editorial quality of that text is poor (ex: misspelled words, duplicate content, generic content, content farmed just to look good to bots), then Google gives users less relevant content of higher quality. If your site has a lot of content that Google considers low quality, it might not even bother to see how good a specific piece of content is just because other content on your site is considered poor. The end result is search results lacking in relevance.
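To make that tradeoff concrete, here is a toy scoring sketch in TypeScript. It is purely illustrative and is not Google's actual algorithm; the signal names and weights are made up. It just shows how speed and site-level quality penalties could push a keyword-relevant, well-linked page below a less relevant one.

```typescript
// Toy illustration only: a simplistic ranking model showing how speed and
// site-level quality penalties could outweigh keyword relevance and links.
// This is NOT Google's algorithm; the signals and weights are hypothetical.

interface PageSignals {
  keywordRelevance: number; // 0..1, how well the page matches the query
  linkingDomains: number;   // count of referring domains
  loadSeconds: number;      // measured page load time
  editorialQuality: number; // 0..1, e.g. spelling/duplication heuristics
  siteQuality: number;      // 0..1, quality of the rest of the site
}

function toyScore(p: PageSignals): number {
  // If the rest of the site looks poor, the page may never get a full look.
  if (p.siteQuality < 0.3) {
    return 0;
  }
  const relevance = 0.5 * p.keywordRelevance + 0.2 * Math.log1p(p.linkingDomains) / 10;
  const speedPenalty = Math.max(0, p.loadSeconds - 2.5) * 0.1; // slow pages lose points
  const qualityPenalty = (1 - p.editorialQuality) * 0.3;
  return relevance - speedPenalty - qualityPenalty;
}

// A highly relevant but slow, poorly edited page can end up scoring below
// a less relevant page that loads fast and reads cleanly.
console.log(toyScore({ keywordRelevance: 0.9, linkingDomains: 500, loadSeconds: 6, editorialQuality: 0.4, siteQuality: 0.6 }));
console.log(toyScore({ keywordRelevance: 0.6, linkingDomains: 50, loadSeconds: 1.5, editorialQuality: 0.9, siteQuality: 0.9 }));
```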

Snitch List

The Cop Blaster Snitch List is the largest list of police informants available for free on the internet. It features well over 10,000 profiles of people who have given information to law enforcement. Most of those are from government records made public under FOIA (https://copblaster.com/coronavirus/). What could stop the largest snitch list on the internet from being number one on Google for "Snitch List" (without quotes)? We are trying to figure that out because that list is now on page 9. We know that all the right keywords are in the right places and that there is plenty of content, so we think the problem is probably due to server performance or a bug falsely flagging the content as poor.

We noticed a spike in poor performance in Google's Core Web Vitals reports over the days leading up to and immediately following the update. Our LCP scores were in the red and we think that is the biggest factor here. LCP stands for Largest Contentful Paint (https://web.dev/lcp/) and refers to the point in the loading process when the main content has most likely loaded. We do various things to speed this up, like lazy loading images and videos, but that seems to have actually lowered the overall score even though it improved the overall user experience. Google did not always process JavaScript the way it does now. It used to be that if something was not loaded in the source code, Google did not count it, so you could get a quick speed score by deferring image and video loads until after the main content loaded. That is not the case anymore; Google now waits for the images and videos to be lazy loaded before scoring the page for speed. Strangely enough, our scores were better when users had to wait for images and videos to load as part of the main page load, even though deferring them makes browsing the site much easier for people.

Recently, we added over 8,000 new pages from a collection of data that for the most part had not been published anywhere Google indexes, and the data that was duplicative we were able to make unique through string manipulation. Unfortunately, those additional records led to an increase in load times, and sometimes the server was running out of CPU and RAM. That is why the server is being upgraded. We think that if we upgrade the server and get it running faster than it was before the new content was added, we should be able to recover.
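For anyone who wants to watch this in their own browser, here is a minimal sketch of how LCP can be observed and how lazy loading could be limited to below-the-fold images. The .hero-image selector is a hypothetical placeholder for whatever the main above-the-fold image is; this is not the exact code running on our sites.

```typescript
// A minimal sketch (browser-side) of watching LCP and of why lazy loading
// everything can backfire: if the largest image above the fold is deferred,
// the LCP timestamp moves later even though the page "feels" faster.

// Report each Largest Contentful Paint candidate as the page loads.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('LCP candidate at', entry.startTime.toFixed(0), 'ms');
  }
});
observer.observe({ type: 'largest-contentful-paint', buffered: true });

// Keep the main image eager so it can become the LCP element early,
// but defer everything else with native lazy loading.
document.querySelectorAll<HTMLImageElement>('img').forEach((img) => {
  const isHero = img.matches('.hero-image'); // hypothetical selector
  img.loading = isHero ? 'eager' : 'lazy';
});
```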

After addressing the speed issue, the only remaining problem is content quality. There are thousands of pages that were generated from user submissions to government websites, and the quality of their work is not very good. We do have a lot of good content created by CopBlaster.com itself, and that does fairly well for the names of individuals being discussed, but the data dump from the COVID-19 snitches is not of high quality. It is, however, unique and relevant, so we don't see why that should hurt the performance of the site for a query like "Snitch List." Unfortunately, users hate writing, so there are several poorly written user generated snitch reports. Deciding whether or not to allow users to post has its ups and downs. The upside is that your site gets more content and people link to you more, but the downside is content quality. The benefits usually outweigh the drawbacks, so we won't start censoring users anytime soon. Again, we are hoping that even though some user-created content is of questionable accuracy, most of the content is still reliable, so the site shouldn't lose relevance in the eyes of Google.

The site has some other problems in that we recently put noindex meta tags on all but a few selected tag archives. We had ended up with thousands of thin tag archives a few months ago, so we decided to noindex tag archives unless the tag has been whitelisted by the administration. Tag archives are great for organizing content beyond categories and locations. For instance, we have a good archive for Assistant United States Attorney Greg Nyhus (https://copblaster.com/hashtag/greg-nyhus/) that Google ranks above all of the individual articles in the archive, so we want to keep that result. On the other hand, we want to avoid results such as the hashtag for the name Greg (https://copblaster.com/hashtag/greg/), because thousands of broad tag matches like that, or thin matches for more unique items, produce spammy-looking poor content. The ranking of the Greg Nyhus tag seems to be an anomaly, because Google appears to strongly favor long articles about people and topics over archives, even when those archives link to many more pages with images of the topics. You can see this in the "Snitch List" (without quotes) results (https://lmgtfy.app/?q=snitch+list), where news articles about snitch lists are outranking actual snitch lists. This is because such articles come from places Google considers high quality, so even though they don't contain such lists, their articles rank better. We are hoping that by getting rid of tags not tied to a specific, unique, and useful archive we will do better.

Another thing we might get rid of is all of the individual profiles for the police decertification database (https://copblaster.com/decertified/). We think that having thousands of pages autogenerated from thin Excel sheets might be hurting performance more than helping it, even though many of those profiles do quite well for searches for those individuals or for their names combined with their previous places of employment. Unfortunately, not many people look for those individuals, so if redirecting those profiles to pages that simply list all the names makes the rest of the site do better, we might do that. We also noticed that the department listings in that section compete with other parts of the site that list posts from the main section geographically when people search for police in specific areas.
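The tag whitelist idea described above boils down to something like the following sketch. The tag slugs and the in-memory whitelist are just examples; on the real site the whitelist would presumably live in the database.

```typescript
// A minimal sketch of the tag-archive whitelist approach: only admin-approved
// tag archives stay indexable, everything else gets a noindex meta tag.
// Tag names below are hypothetical examples.

const whitelistedTags = new Set(['greg-nyhus']); // admin-approved archives worth indexing

// Returns the robots meta tag to emit for a given tag archive page.
function robotsMetaForTag(tagSlug: string): string {
  return whitelistedTags.has(tagSlug)
    ? '<meta name="robots" content="index, follow">'
    : '<meta name="robots" content="noindex, follow">';
}

console.log(robotsMetaForTag('greg-nyhus')); // kept in the index
console.log(robotsMetaForTag('greg'));       // thin, broad archive stays out
```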

Celebrities with Herpes

STD Carriers has always done well for searches about Celebrities with STDs including Herpes. The Celebrities with STDs section of the site (https://stdcarriers.com/famouspeople/celebrities.aspx) has been the largest list of celebrities with STDs for over a decade. The site is typically in the top 5 for "Celebrities with STDs" and is still on the first page despite the update, so why would it suddenly find itself on the second page for "Celebrities with Herpes"?

We think it might have something to do with the same performance problems impacting Cop Blaster, but less so, because we have not seen the same flags for poor LCP from Google for STD Carriers as we have for Cop Blaster. Both sites run as separate applications on the server, so if Cop Blaster freezes, the other sites still run. STD Carriers has thousands of results but not nearly as much data as Cop Blaster, so we think that although a performance boost will help, it is not the main issue.

Unlike the Snitch List on Cop Blaster, none of the content in this archive is user generated, so it does not have any reliability issues, but the rest of the site is mostly user generated reports about other people (https://stdcarriers.com/registry/search.aspx). For that reason we think that the other parts of the site might be suffering due to poor user generated content in that section. Again, the benefit of the user generated content far outweighs doing poorly on queries about celebrities, so getting rid of it is not an option. Our best hope is that Google can distinguish between good and poor content on the same site.

STD Carriers was hit really hard last April/May by the updates punishing health care sites that do not appear to have any medical expertise, but it appears to have recovered on queries looking for other types of content on the site that do not require medical expertise. Celebrity gossip is not something you need medical expertise to do media research about. We also recovered the number one spot for our brand "STD Carriers," for which we had been pushed to page 4. We also lost our number one result for "STD Registry" but are still on the first page. Since no doctors are legally allowed to create such a registry and the government has no such public registry, we think that is why we are still able to do well for that one. That and the fact that most other first page results, including the top result, link to it.

Conclusion

We have reached the conclusion that performance is most likely the reason for these problems, but that there are other issues impacting the sites as well. Whenever Google elects to rank a Facebook post containing an article linking to your site over that same article on your site, it is probably saying that at least the Facebook page won't take so long to load. That explains how sites whose backlinks have improved could still do worse in the SERPs. We also think there is a problem with how Google treats category archives. Such pages are typically heavy on images and links but thin on text content. We think we may need to include more text on those pages if we are going to overcome these problems, but even that would lead to more duplicate content issues with the articles themselves, and we don't have time to make changes like that right now. We will post updates as we can.




Posted on 12/08/2020

In response to some feedback on Reddit I've set the height of the slideshow div tag to 500px and the max height of the images to that same value, because people were rightfully complaining about having trouble reading the article. I apologize, but as the site says, it is "under development," and this slideshow feature, as well as allowing multiple images with each post, is new to the new site. Only about 5-10 posts use the slideshow feature so far, so I figured focusing on other stuff was more important. Glad people pointed it out though; thank you. I don't want people having trouble reading anything.
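For anyone curious, the fix amounts to something like the following sketch. The .slideshow selector is an assumption and the real markup on the site may differ.

```typescript
// A rough sketch of the layout fix described above: cap the slideshow
// container at 500px and keep its images within that height.
// The .slideshow selector is a hypothetical placeholder.

const slideshow = document.querySelector<HTMLDivElement>('.slideshow');
if (slideshow) {
  slideshow.style.height = '500px';
  slideshow.querySelectorAll<HTMLImageElement>('img').forEach((img) => {
    img.style.maxHeight = '500px';
  });
}
```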


Posted on 12/08/2020

I analyzed CopBlaster.com using Sistrix.com and the following errors were found. This indicates that my theory about performance being the problem was probably right. Over 800 internal server errors indicate that there was probably insufficient CPU and RAM for SQL Server to handle the amount of data/queries needed to serve all the site visitors when Sistrix crawled it.

I'm not sure if I can make Sistrix recrawl the site before its next scheduled crawl on the 15th, but today I added 3 more CPU cores and one more GB of RAM to the VPS hosting the site. I am hoping that will do the trick, because if not I'll have to upgrade to the host's most expensive VPS plan, which offers 10 GB of RAM. The next step after that would be a dedicated server. I am hoping that since the Windows Server was only running out of available CPU and RAM a few times a day, this will be enough.

Sistrix Crawl Results
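Once the upgrade settles in, something like this rough sketch could be used to spot-check whether the internal server errors are still happening. The URL list is just a small sample for illustration, not the actual Sistrix crawl list.

```typescript
// A minimal sketch for spot-checking whether pages still return 5xx errors
// after the CPU/RAM upgrade. The sample URLs are illustrative only.

const sampleUrls = [
  'https://copblaster.com/snitches/',
  'https://copblaster.com/decertified/',
];

async function checkForServerErrors(urls: string[]): Promise<void> {
  for (const url of urls) {
    const res = await fetch(url, { method: 'HEAD' });
    if (res.status >= 500) {
      console.log(`Server error ${res.status} at ${url}`);
    } else {
      console.log(`OK (${res.status}) ${url}`);
    }
  }
}

checkForServerErrors(sampleUrls).catch(console.error);
```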

Posted on 12/08/2020

The MOZ data for the sites are as follows:

CopBlaster.com

Domain Authority: 26

Linking Root Domains: 529

Ranking Keywords: 1.9k

Spam Score: 5%

STDCarriers.com

Domain Authority: 32

Linking Root Domains: 282

Ranking Keywords: 182

Spam Score: 15%

I don't see anything there that would indicate a drop. Sure, the DA scores are not very high, but the sites at least have a few good links pointing at them. Enough to indicate that the content is OK.