Documenting Google’s site-level quality evaluation and its impact on search rankings via the Gabeback Machine (Googler quotes, videos, documentation and more)


It’s not like all URLs on a website suddenly become poor quality overnight. It is a site-level adjustment that can affect the ranking of an entire site.

After a recent creator summit, some participants said that Pandu Nayak from Google joined the group to answer questions about the helpful content update, Google’s quality evaluation, the current state of the search results, and more. One comment from that meeting was alarming for the site owners in attendance and the broader SEO community. When asked about the HCU site-level classifier and its impact on rankings across a site, Pandu replied that there is none and that Google handles rankings at the page level.

After reading these comments, my head almost exploded for several reasons. First, this is simply not true, and Google itself has explained how rankings can be impacted at the site level many, many times over the years. Don’t worry, I’ll get to that soon. Second, the update most of the group was discussing (the helpful content update) was literally designed to apply a site-level classifier if a website was deemed to contain a significant amount of content written for search engines versus humans. That was literally in the announcement and documentation until Google integrated the update into its core ranking systems. I’ll cover more about the HCU soon as well.

Google hasn’t been shy about explaining that it has site-level quality algorithms that can have a big impact on rankings, especially with major algorithm updates like broad core updates. In addition, Google has developed several algorithm updates over the years that specifically work at the site level. For example, Panda, Penguin, Pirate, and of course the helpful content update I just explained. That update broadly affected websites (when the classifier was applied, sites were negatively impacted site-wide).

So hearing from a senior leader at Google (the VP of Search) that everything is handled at the page level didn’t go over well (either with the meeting attendees or with the SEO community at large). It felt like gaslighting, and it’s simply not accurate based on everything I’ve heard from Googlers over the years, what Google has explained about major algorithm updates, and what I’ve seen first-hand while helping many companies deal with major algorithm update hits over the years. At the end of this post I’ll also return to Pandu’s statement, cover how site-level evaluation can feed into page-level rankings, and explain what I think Pandu might have meant.

Documenting site-level impact over the years: This is where the “Gabeback Machine” comes into play.
I would like to officially welcome you to the G-Squared archives. Over the years, I’ve documented many statements from Googlers about major algorithm updates, how those updates work, how rankings can be affected by various systems, algorithms, manual actions, and more. That information has been extremely helpful, because what Googlers say is important. And Google’s documentation is important. I like to call my archive “the Gabeback Machine”, because it’s like the Wayback Machine, but laser-focused on SEO.

Below I’ll cover a number of key points about site-level evaluation and how it impacts rankings across a site. After reviewing the information, you can make your own decision about whether site-level impact exists. I think you’ll catch on quickly, and maybe even learn a thing or two based on what I’ve documented over the years.

Let’s get started.

Major algorithm updates with widespread impact (site level):
Google has developed some powerful algorithm updates outside of its core ranking system over the years. Actually, I just covered this topic in my NESS presentation. If Google detects a gap or problem that its core ranking system isn’t fixing, it can create additional algorithm updates and systems in a “lab.” After intensive testing, Google may release these updates to address these specific issues.

Sometimes these additional algorithm updates work well, and sometimes they go off the rails. Regardless, they have had site-level impact in the past. For example, Panda, the original Penguin, Pirate (which still roams the high SEO seas), and then the latest to hit the market – the infamous helpful content update (HCU).

Google creates major algorithm updates outside of broad core updates.
Google engineers working on major algorithm updates.
Google heavily testing major algorithm updates.
What can sometimes happen when Google creates major algorithm updates outside of core updates?

When these updates rolled out, affected websites often tanked. They were punitive algorithm updates (built to demote, not to reward). That’s because a classifier was applied to sites doing exactly what those algorithms targeted. You can think of it as a site-wide ranking signal that can drag down rankings across the site. Not every query drops the same way, but it’s like walking through heavy mud where every step forward is hard… Essentially, a filter is applied to your site’s rankings. And “site” is often defined at the hostname (i.e. subdomain) level. More on that soon.
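To make that “filter” idea concrete, here’s a minimal sketch. This is purely a toy model of my own for illustration (not Google’s actual implementation, and every name and number here is made up): the page-level scores stay what they are, but a site-level demotion factor drags every page down at once.

```python
# Toy model only -- NOT Google's actual implementation. Illustrates how a
# site-level classifier can act like a filter on otherwise-unchanged
# page-level scores.

def rank_score(page_score: float, site_demotion: float) -> float:
    """Combine a page-level score with a site-level demotion factor in (0, 1]."""
    return page_score * site_demotion

# Hypothetical page-level quality scores for one site:
pages = {"/post-a": 0.92, "/post-b": 0.75, "/post-c": 0.40}

# The site gets classified as "unhelpful", so a demotion factor is applied.
# Every page drops, even though no individual page score changed.
demoted = {url: rank_score(score, 0.5) for url, score in pages.items()}
print(demoted)  # every score halved across the entire site
```

The point of the sketch: rankings can still be computed per page while a single site-level signal moves the whole site at once, which matches how sites looked when these updates rolled out.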

It’s not like every URL became poor quality in a day…
If you were impacted by any of the updates I mentioned, you could easily see the site’s search visibility drop significantly, often in a single day. As far as site-level versus page-level impact goes, it’s not like every URL on the site suddenly became poor quality… That’s ridiculous. It’s that the classifier was applied and rankings dropped sharply across the site.

For example, here are some websites affected by the helpful content update:

Example of a website severely impacted by the September 2023 Helpful Content Update (HCU(X)).
Another example of a website heavily impacted by the September 2023 Helpful Content Update (HCU(X)).

Does this look like a page-level impact to you?

Exhibit 1: The HCU
And here is first-hand evidence that Google has site-level classifiers. I was one of the few SEOs Google contacted in the summer of 2022, before the original helpful content update rolled out. Google wanted to explain what it had created and give some of us a heads-up, since we were heavily focused on covering algorithm updates. During that call, documented in my post about the first helpful content update, Google explained it was a site-level classifier that could impact rankings across an entire site.

The HCU classifier at the site level

And of course the site-level part was also documented in Google’s own posts and documentation for the helpful content update.

Google's documentation on the helpful content update's site-level classifier.

To be clear: GOOGLE ITSELF DOCUMENTED THAT THE HCU WOULD HAVE SITE-LEVEL IMPACT.

Beyond the HCU, Panda, Penguin, and Pirate also had site-level impact. By the way, as far as we know, Pirate is still running and is refreshed often. When a website was affected by Panda, Penguin, or Pirate, rankings dropped across the site. Again, it was like a filter had been applied. It was not a page-level downgrade. It was a site-level demotion based on a classifier applied to the site.

Exhibit 2: Google’s Penguin Algorithm.
Google Penguin launched in 2012 to target websites that built large amounts of spammy links to game PageRank. This algorithm update was also a site-level adjustment. Many affected websites plummeted in the rankings as soon as the update rolled out. It was an extremely punitive algorithm, just like the HCU and Panda.

Here is a quote from the launch announcement. Note the wording about decreasing rankings for sites (and not pages):

“In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. This algorithm represents another improvement in our efforts to reduce webspam and promote high-quality content.”

“Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.”

Exhibit 3: Google’s Panda Algorithm:
Launched in 2011, Panda targeted websites with a significant amount of low-quality and/or thin content. Affected websites often saw their rankings plummet after the update rolled out. A Panda score was calculated, and sites on the wrong side of the algorithm paid a heavy price. It was like a giant negative filter applied to the website. The good news was that Panda ran frequently (typically every 6-8 weeks), so sites actually had a chance to recover… and a lot of them did.

Here is a quote from the launch announcement. Again, pay attention to the language about sites and not pages:

“This update is designed to reduce rankings for low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites – sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

“That’s why it’s important that high-quality sites are rewarded, and that’s exactly what this change does.”

“Based on our testing, we’ve found the algorithm is very accurate at detecting site quality. If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively.”

Exhibit 4: Google’s Pirate Algorithm.
Aye, matey! Many people don’t think about Google’s Pirate algorithm, but it can also have a big impact on rankings across a website (for sites that receive a lot of DMCA takedown notices). I wrote a post about my analysis of the algorithm update in 2013, and like Panda, Penguin, and the HCU, websites often took a big hit when impacted by Pirate.

Here is a quote from the announcement about Pirate. Yes, sites and not pages:
“Sites with high numbers of removal notices may appear lower in our results.”

A quick note about site and hostname for major algorithm updates:
By the way, Google’s site-level quality algorithms are usually applied at the hostname level. For this reason, it is common for a website with multiple subdomains to experience different impacts from major algorithm updates. One subdomain could decline sharply while the others surge or remain stable.
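As a quick illustration of what grouping by hostname means in practice (a hypothetical sketch, not Google’s code; the URLs are invented), Python’s standard `urlparse` shows how two subdomains of the same registered domain resolve to different hosts and would therefore be evaluated as separate “sites”:

```python
# Illustrative only: site-level classifiers are typically keyed by hostname,
# so subdomains of one registered domain are treated as separate "sites".
from urllib.parse import urlparse

urls = [
    "https://blog.example.com/post-1",
    "https://blog.example.com/post-2",
    "https://shop.example.com/product-9",
]

# Group URLs by hostname: blog.example.com and shop.example.com would
# each get their own site-level evaluation under this model.
sites = {}
for url in urls:
    sites.setdefault(urlparse(url).hostname, []).append(url)

print(list(sites))  # two separate hosts, hence two separate "sites"
```

That is why one subdomain can tank during an update while a sibling subdomain on the same domain remains stable.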

And that leads nicely into a whole bunch of quotes, links, and videos about site-level evaluation and impact (based on quotes from Googlers and information in Google’s documentation). Again, I tapped the “Gabeback Machine” to find this important information.

So strap on your SEO helmets. There is a lot to discuss.

Exhibit X: Information from Google about impact at the site level: documentation, links, tweets and videos.
I document everything here too. It’s kind of my thing. From time-stamped Webmaster Hangout videos to key tweets and social shares to quotes from conference Q&As, I believe it’s important to document Googler statements about algorithm updates, systems, rankings, and more.

In my opinion, site-level evaluation is incredibly important for site owners to understand, so I have documented all of this carefully over the years. I’ve also shared a lot of it in my posts and presentations about major algorithm updates, and on social media. And Barry has also covered some of my tweets about site-level impact. With all that said, I wanted to collect more of these quotes, tweets, and videos in one place (especially given Pandu Nayak’s recent comments). Hence this post!

Below I present everything about site-level evaluation and impact, in no particular order:

Google’s John Mueller explains that quality is a site-level signal:
“We index things page by page and rank things page by page, but there are some signals that we can’t reliably capture on a per page basis, for which we need a little better understanding of the entire site. And quality kind of falls into that category.”

YouTube video

A Google patent that explains Google ranks search results using page-level scores and then applies site-level scores based on things like quality.

Address ALL quality issues -> Via John Mueller:
Do you have a lot of older, low-quality pages? Yes, they can hurt your website in Search. Google looks at the website as a whole, so if it sees a lot of low-quality content, it may take that into account when ranking:

YouTube video

More from John Mueller on site-level impact:
“Of course we rank pages, there is the content. We also have other signals that are on a broader level – there’s no conspiracy: Panda, Penguin, even basic things like geotargeting, Safe Search, Search Console settings, etc.”

Quotes from Paul Haahr, a leading Google ranking engineer, from an article written by Danny Sullivan (not yet at Google at the time):

“If all things are equal between two different pages, site-wide signals can help the individual pages.”
“Imagine two articles on the same topic, one in the Wall Street Journal and another on a fly-by-night domain. Absent any other information, the Wall Street Journal article probably looks better, given the information we have now. That would mean propagating information from the domain down to the page level,” Haahr said.

And another quote from the interview:
“That doesn’t mean Google doesn’t have site-wide signals that in turn can impact individual pages. How fast a website is, or whether a site is impacted by malware, are two things that can impact the pages of those sites. Or, in the past, Google’s Penguin update, which targeted spam on a site-wide basis.”

Google March 2024 Core Update FAQ:
In Google’s own documentation for the broad March 2024 core update, there is an FAQ question asking whether there are site-wide ranking signals. In its answer, Google explains that some site-wide signals are taken into account. Again, this is Google’s own documentation about broad core updates.


Via John Mueller: “When assessing relevance, Google tries to look at the overall picture of the website.” And: “Some of the things that Google evaluates are focused at the domain level.”

“Yes, sometimes we look at the site as a whole to see how it fits with the rest of the web and to see when it would be relevant to show in search results…”

“We try to evaluate the quality of a site as a whole as an initial understanding of how good the website is. Does it have relevant, unique and compelling content that we can show in search results?”

With core updates, things focus more on the site as a whole, i.e. the big picture.

YouTube video

“Some things we pay attention to are the overall quality of the website. If you have significant parts that are of lower quality, we may think that the site is not great overall. And this can have an impact in different parts of the website. Lower quality content may detract from the higher quality content on the site.”

YouTube video

Regarding the product reviews update, low-quality content can affect the high-quality content. Here was my tweet: Are you affected by the product reviews update and have a mix of high-quality and low-quality content? Via John Mueller: “A mix like that can mess things up. Google looks at a site from an overall quality perspective, so the low can drag down the high.”:

YouTube video

John Mueller on core updates: it’s not about individual issues, but about the quality of the site as a whole.
“Google doesn’t focus on individual issues, but on the overall relevance of the site (including content quality, UX issues, ads on the page, how things are presented, sources, and more).
When you’re working on improvements, don’t just focus on improving things around the content; you should also work on improving the content itself. I.e., where is there low-quality content, where are users confused, etc.? Then address that.”

YouTube video

How does quality re-evaluation work after removing low-quality content? -> Via John Mueller: “It can take months (6+ months) for Google to re-rank a site after the overall quality has improved. This is partly due to re-indexing and partly due to collecting quality signals. Also, testing your way out of a broad core update hit is very difficult, because a small subset of pages isn’t enough for Google to classify a site as higher quality.”

“A lot depends on whether a URL is indexed or not. I wouldn’t…overstate quality, and I wouldn’t assume that general site quality improvements are simple technical tweaks. In addition, a significant improvement in the quality of a website has an impact on all areas.”

From an article on Search Engine Roundtable. Here is John’s tweet:

John Mueller's tweet about impact on site-level rankings.

And the great Bill Slawski covered patents describing how Google calculates website quality scores.

Here’s how to lose or gain rich snippets based on an overall site quality score:
At 3:45 in the podcast (and something I’ve documented multiple times in my blog posts): if rich snippets are lost across an entire site, major quality problems can be at play. There are several site-level signals Google uses that can ultimately impact things like rich snippets. And there was a Panda score that could affect your site’s overall quality evaluation.

More from Search Engine Roundtable on losing rich snippets due to overall quality:

And from Barry at Search Engine Roundtable on losing rich snippets based on site-level quality scores:

And I reported more about it on Twitter:
Site-level quality signal -> If you don’t see rich results in the SERPs and they’re set up correctly from a technical perspective, it’s usually a sign that Google’s quality algorithms aren’t 100% happy with the site. I.e. Can Google trust the website enough to show rich results?…
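Here’s one way to picture that trust gate. This is purely a toy model of my own for illustration (the function and threshold are invented, not anything Google has published): valid structured data alone doesn’t earn rich results unless the site-level quality score also clears a bar.

```python
# Toy model only -- NOT Google's implementation. Illustrates the idea that
# rich results can be gated on a site-level trust/quality threshold.

SITE_QUALITY_THRESHOLD = 0.7  # hypothetical cutoff

def show_rich_results(markup_valid: bool, site_quality: float) -> bool:
    """Valid structured data alone isn't enough in this model; the
    site-level quality score must also clear the hypothetical threshold."""
    return markup_valid and site_quality >= SITE_QUALITY_THRESHOLD

print(show_rich_results(True, 0.9))  # True: markup valid, site trusted
print(show_rich_results(True, 0.4))  # False: markup fine, site quality too low
```

This matches what SEOs observe in the wild: the structured data validates perfectly, yet the rich results never appear until overall site quality improves.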

John on improving a page and how that can affect the ranking of that one page:
There are still signals that Google evaluates at the site level.
Quality is one of those signals. Improve the website overall, and over time.

YouTube video

Google on low-quality UGC in some parts of a site affecting the entire site:

Google’s documentation explains this: https://developers.google.com/search/blog/2021/05/prevent-portions-of-site-from-spam

Google research on authority and trust:
In the paper, the Google researchers address authority and trust. Yes, at the domain level: “There are many well-known techniques for assessing the authority or veracity of a webpage, from fact-checking claims within a single page to aggregating quality signals at the logical domain level.”

The same goes for topicality: If Google can determine that a website is a good match for the broader topic, it can also show that website for broader searches on that topic. Google doesn’t need to focus on individual pages to surface the site for that broader topic.

YouTube video

Google’s quality algorithms impact Discover visibility (broadly speaking):

YouTube video

Via John Mueller:
Yes, low-quality pages on a website can impact the overall “authority” of that website. A few won’t kill you:

How does Google detect quality changes, and how long does it take for the effects to be visible? Via John Mueller:
Many of our quality algorithms do not have a specific time frame. It could take months as Google needs to see changes throughout the website (as the entire website is processed again).

YouTube video

John on an overall view of the site in terms of expertise:
Via John Mueller: Re: the March core update – these are quality updates that we release. On the one hand, technical SEO problems are always smart to fix, but if the overall view of the website makes it hard to say you’re an expert, then that may also play a role.

YouTube video

If a new site is a high-quality site that we trust, we may rank it higher from the start.

Another look at Pandu’s comments on page-level versus site-level impact.
Before I end this post, I want to revisit Pandu’s comments about page-level versus site-level impact on rankings. Perhaps the confusion is simply semantics. For example, Google’s page-level rankings may incorporate site-level evaluation. In the examples I provided above, this was mentioned several times (e.g., by Paul Haahr from Google). If that’s the case, Pandu could be technically correct about page-level impact, while filters based on site-level evaluation are still being applied.

For example, think of a Panda score, an HCU classifier, a Pirate classifier, a site-level quality score for broad core updates, etc., feeding into the page-level score. I’m just thinking out loud, but the idea of a page-level rating that ignores site-level evaluation is crazy to me. And everything I’ve documented over the years, plus what I’ve seen first-hand while helping many companies navigate major algorithm updates, shows me that site-level evaluation and impact is definitely happening.

Again, it’s not like all URLs on a website suddenly became poor quality overnight. It’s a site-level score that can move an entire site’s rankings down or up. Yes, site-level impact is real.

GG
