Traffic Google would send, and thus abuse was born. As an example, try searching using Google's "last 24 hours" function: [screenshot: SEOmoz blog post search on Google in the past 24 hours]. Seriously, go have a look; the quantity of "junk" you wouldn't want in your search engine's index is remarkable. Since Tom published the post on Xenu's Link Sleuth last night, Google has already discovered more than 250 pages around the web that include that content or mentions of it.
If, according to Technorati, the blogosphere is still producing 1.5 million+ posts each week, and each post gets scraped and re-published hundreds of times the way the example above was, that's conservatively growing the web by ~20 billion pages each year. It should come as no surprise that Google, along with every other search engine, has absolutely no desire to keep more than, possibly, 10-20% of this type of content (and anyone who's tried re-publishing in this fashion for SEO has likely felt that effect). Claiming to have the biggest index size may actually be a strike against relevancy in this world (according to Danny Sullivan, it's been a dead metric for a long time).
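To make the ~20 billion figure concrete, here's a quick back-of-the-envelope sketch in Python. The 250-copies-per-post multiplier is an assumption borrowed from the Xenu's Link Sleuth example above, not a number Technorati reports, so treat the result as an order-of-magnitude estimate only:

    # Rough sketch of the "~20 billion new pages per year" estimate.
    # Assumption: every original post is scraped/re-published ~250 times,
    # matching the single-post example above; Technorati only supplies
    # the 1.5 million posts-per-week figure.
    original_posts_per_week = 1_500_000
    copies_per_post = 250          # assumed duplication factor
    weeks_per_year = 52

    pages_per_year = original_posts_per_week * copies_per_post * weeks_per_year
    print(f"{pages_per_year:,}")   # 19,500,000,000 -- roughly 20 billion

In other words, the original content is a rounding error; it's the duplication that produces web-scale bloat, which is exactly the kind of material a search engine has no reason to keep.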

So - long story short - Google (very likely) has a limit it places on the number of URLs it will keep in its main index, and potentially return in the search results, for a given domain. The interesting part is that, in the past 3 months, the number of big websites (I'll use that to refer to sites with in excess of 1 million unique pages) we've talked to, helped through Q+A, or consulted with that have lost wide swaths of indexation has skyrocketed, and we're not alone.