Google on Effect of Low Quality Pages on Sitewide Rankings


In a Google Webmaster Hangout, someone asked whether low quality pages on a website could drag down the rankings of the entire website. Google's John Mueller's answer gave insight into how Google judges and ranks web pages and websites.

Do a Few Pages Drag Down the Entire Site?

The question asked whether one part of a website could drag down the rest of the site.

The question:

“I’m curious if content is judged on a page level per the keyword or the site as a whole. Only a sub-section of the site is buying guides and they’re all under their specific URL structure.

Would Google penalize everything under that URL holistically? Do a few bad apples drag down the average?”

Difference Between Not Ranking and Penalization

John Mueller started by correcting an assumption about being penalized that was inherent in the question. Web publishers often complain about being penalized when in fact they are not. What is actually happening is that their page simply isn't ranking.

There is a difference between a penalty and Google looking at your page and deciding not to rank it.

When a page fails to rank, it's often because the content isn't good enough (a quality issue) or the content isn't relevant to the search query (relevance being to the user). That's a failure to rank, not a penalty.

A common example is the so-called Duplicate Content Penalty. There is no such penalty. It's an inability to rank caused by content quality.

Another example is the Content Cannibalization Penalty, another so-called penalty that isn't actually a penalty.

Both relate to an inability to rank because of specific content issues, but they are not penalties. The solution to each involves identifying the cause and fixing it, just like any other failure-to-rank issue.

A penalty is something completely different: it's a consequence of a blatant violation of Google's guidelines.

John Mueller Defines a Penalty

Google's Mueller began his answer by defining what a penalty is:

“Usually the word penalty is associated with manual actions. And if there were a manual action, like if someone manually looked at your website and said this is not a good website then you would have a notification in Search Console.

So I suspect that’s not the case…”

How Google Defines Page-Level Quality

Google's John Mueller appeared to say that, when it comes to ranking, Google tries to focus on page-level quality rather than overall site quality. But he also said this isn't possible with every website.

Here is what John said:

“In general when it comes to quality of a website we try to be as fine grained as possible to figure out which specific pages or parts of the website are seen as being really good and which parts are kind of maybe not so good.

And depending on the website, sometimes that’s possible. Sometimes that’s not possible. We just have to look at everything overall.”

Why Do Some Sites Get Away with Low Quality Pages?

John's answer is interesting. But it also leads to another question: why do some sites get away with low quality sections while others can't?

I suspect, and this is just a guess, that it may be a matter of the density of low quality noise across the site.

For example, a site might be made up of high quality web pages but feature a section that contains thin content. In that case, because the thin content is confined to a single section, it may not interfere with the ability of pages on the rest of the site to rank.

In a different scenario, if a site mostly contains low quality pages, the good quality pages may have a hard time gaining traction through internal linking and the flow of PageRank through the site. The low quality pages could theoretically hinder a high quality page's ability to acquire the signals necessary for Google to understand that page.

This is where John described a site that may be unable to rank a high quality page because Google couldn't get past all of the low quality signals.

Here's what John said:

“So it might be that we found a part of your website where we say we’re not so sure about the quality of this part of the website because there’s some really good stuff here. But there’s also some really shady or iffy stuff here as well… and we don’t know like how we should treat things over all. That might be the case.”

Effect of Low Quality Signals Sitewide

John Mueller offered an interesting insight into how low quality on-page signals can interfere with the ability of high quality pages to rank. Equally interesting, he also suggested that in some cases the negative signals may not interfere with the ability of high quality pages to rank.

So if I were to take one idea away from this exchange, it would be that a site with mostly low quality content is going to have a harder time ranking a high quality page.

And conversely, a site with mostly high quality content will be able to rise above some low quality content that is separated into its own little section. It is, of course, a good idea to minimize low quality signals as much as you can.

Watch the Webmaster Hangout here.




