
It is normal for 20% of a site not to be indexed

Google’s John Mueller answered a question about indexing and provided insight into how overall site quality affects indexing patterns. He also offered the insight that it is within the normal range that 20% of a site’s content is not indexed.

Pages discovered but not crawled

The person who asked the question provided background information about their site.

Of particular concern was that the server was overloaded and whether that could affect how many pages Google indexed.

When a server is overloaded, a request for a web page may result in a 500 error response. This is because when a server is unable to serve a web page, the default response is a 500 internal server error message.

The person who asked the question did not mention whether Google Search Console reported that Googlebot received 500 error response codes.

So if it is the case that Googlebot did not receive a 500 error response, then the problem of server overload is probably not the reason why 20% of the pages are not indexed.
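One way to rule out server overload is to request the pages directly and look at the status codes returned. The following is a minimal Python sketch, not something from the article; the helper names and example URLs are illustrative:

```python
import urllib.request
import urllib.error

def check_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a URL (e.g. 500 from an overloaded server)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses are raised as HTTPError; err.code holds the status
        return err.code

def classify_status(code: int) -> str:
    """Roughly classify a status code the way a crawler would interpret it."""
    if 200 <= code < 300:
        return "ok"
    if 500 <= code < 600:
        return "server error"  # the overload case discussed above
    return "other"

# Hypothetical usage against your own URLs:
# for url in ["https://example.com/", "https://example.com/page-2"]:
#     print(url, classify_status(check_status(url)))
```

If a spot check like this never turns up 5xx codes (and Search Console's crawl stats agree), server overload can reasonably be set aside as the cause.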


The person asked the following questions:

“20% of my pages are not indexed.

It says they have been discovered but not crawled.

Does this have something to do with it not being crawled due to potential congestion on my server?

Or does it have to do with the quality of the page?”

Crawl budget is generally not why small sites have unindexed pages

Google’s John Mueller offered an interesting explanation of how overall site quality is an important factor in determining whether Googlebot will index multiple web pages.

But first, he discussed how the crawl budget is not usually a reason pages remain unindexed for a small site.

John Mueller replied:

“Probably a bit of both.

So normally, if we’re talking about a smaller site, it’s mostly not a matter of us being limited by the crawl capacity, which is the crawl budget side of things.

If we’re talking about a site that has millions of pages, then that is something where I would consider looking at the crawl budget side of things.

But for smaller sites, probably not.”


Overall site quality determines indexing

John then went into detail about how overall site quality can affect how much of a site is crawled and indexed.

This part is particularly interesting because it gives a look at how Google rates a site in terms of quality and how the overall impression affects indexing.

Mueller continued his response:

“In terms of quality, when it comes to understanding the quality of the site, it’s something we take quite strongly into account when it comes to crawling and indexing the rest of the site.

But it is not something that is necessarily related to the individual URL.

So if you have five pages that are not currently indexed, it is not that these five pages are the ones we would consider low quality.

It’s more that … overall, we might consider this site to be a little bit lower quality. And therefore we will not go out and index everything on this site.

Because if we do not have that page indexed, then we do not really know if it is high quality or low quality.

So this is the direction I would go there … if you have a smaller site and you see that a significant portion of your pages are not indexed, then I would take a step back and try to reconsider the overall quality of the site, and not focus so much on technical issues for these pages.”

Technical factors and indexing

Mueller then mentioned technical factors and how easy it is for modern sites to get that part right, so that it does not get in the way of indexing.

Mueller observed:

“Because I mostly think websites today are technically reasonable.

If you use a common CMS, it’s really hard to do something really wrong.

And it’s often more a matter of overall quality.”

It is normal for 20% of a site not to be indexed

This next part is also interesting, as Mueller characterizes having 20% of a site unindexed as something within the normal range.

Mueller has access to more information about how much of a typical site goes unindexed, so I take him at his word, since he speaks from Google’s perspective.

Mueller explains why it is normal for pages not to be indexed:

“The other thing to keep in mind when it comes to indexing is that it is quite normal that we do not index everything from the site.

So if you look at a larger site or even a medium-sized or smaller site, you will see fluctuations in indexing.

It will go up and down, and it will never be the case that we index 100% of everything that is on a site.

So if you have a hundred pages and (I do not know) 80 of them are indexed, then I would not see it as a problem you have to solve.

Sometimes that is just how it is at the moment.

And over time, when you have something like 200 pages on your site and we index 180 of them, that percentage gets a little smaller.

But it will always be the case that we do not index 100% of everything we know about.”
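Mueller’s example can be checked with simple arithmetic: as the site grows and the indexed count grows with it, the unindexed share shrinks even though some pages always remain unindexed. A quick sketch using his numbers:

```python
def unindexed_share(total_pages: int, indexed_pages: int) -> float:
    """Percentage of known pages that are not indexed."""
    return 100.0 * (total_pages - indexed_pages) / total_pages

# Mueller's numbers: 80 of 100 pages indexed, later 180 of 200 indexed
print(unindexed_share(100, 80))   # 20.0
print(unindexed_share(200, 180))  # 10.0
```

The unindexed share drops from 20% to 10%, which is the "percentage gets a little smaller" effect he describes.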


Do not panic if pages are not indexed

Mueller shared quite a lot of information about indexing that can be put to use.

It is normal for 20% of a site not to be indexed.

Technical issues are unlikely to hamper indexing.

Overall site quality can determine how much of a site gets indexed.

How much of a site gets indexed fluctuates.

Small sites generally do not have to worry about crawl budget.


Watch Mueller discuss normal indexing behavior starting at about the 27:26 mark.
