Thursday, April 27, 2006

Bigdaddy dropping pages...

Some sites have been all but dropped from the index, while others are slowly building their page counts back up. One webmaster saw indexed pages fall from 11,000 to just 2 over the past week; another site reported a drop from 500,000 pages down to 44,300, with a sharp fall in traffic as a result. The apparent goal is to weed out spam sites and duplicate content - but that is yet to be confirmed.

One thing to bear in mind is that Bigdaddy has different crawl priorities, which may account for some of the drops. If you have had spam problems in the past, you might also want to file a reinclusion request. Spammers who use Google Sitemaps to feed pages into the index should take note as well.
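For reference, a Google Sitemap is just an XML file listing the URLs a site wants crawled, which is what makes it attractive for feeding in pages at scale. Below is a minimal Python sketch that writes one out; the URLs are placeholders, and the 0.84 schema namespace is, as far as I know, the one Google Sitemaps has used so far.

    # A minimal sketch of writing a Google Sitemap file. The URLs are
    # hypothetical; the 0.84 namespace is the version Google's sitemap
    # documentation has listed so far (an assumption on my part).
    from xml.sax.saxutils import escape

    urls = [
        "http://www.example.com/",
        "http://www.example.com/products.html",
    ]

    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(url) for url in urls
    )

    with open("sitemap.xml", "w") as sitemap:
        sitemap.write(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
            + entries + "\n</urlset>\n"
        )

Once the file is uploaded, you point Google at it from your Sitemaps account - so anyone, including a spammer, can hand the crawler a long list of URLs without waiting for them to be discovered through links.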

Matt Cutts confirmed at Boston PubCon that Google will radically change the process used to spider pages. Caching will be implemented on the spider, reducing the number of requests Googlebot has to make for each page.
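Matt didn't spell out the mechanics, but the standard HTTP way to get that effect is a conditional GET: the spider remembers the Last-Modified header from its previous visit and asks the server to resend the body only if the page has changed since. A small Python sketch of the idea, purely illustrative (the cache layout and fetch helper are mine, not Google's):

    # Sketch of crawler-side caching via HTTP conditional GET.
    # A 304 Not Modified response means we can reuse the cached copy
    # instead of downloading the page again.
    import urllib.request
    import urllib.error

    cache = {}  # url -> (last_modified_header, body)

    def fetch(url):
        request = urllib.request.Request(url)
        cached = cache.get(url)
        if cached:
            # Ask the server to send the body only if it changed.
            request.add_header("If-Modified-Since", cached[0])
        try:
            with urllib.request.urlopen(request) as response:
                body = response.read()
                cache[url] = (response.headers.get("Last-Modified", ""), body)
                return body
        except urllib.error.HTTPError as err:
            if err.code == 304:  # unchanged: serve from the cache
                return cached[1]
            raise

Every page that comes back 304 is a body the spider never has to transfer, which is presumably the saving Matt was referring to.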

If you want to help test-drive the new results, you can do so by searching at 66.249.93.104. If you’re unhappy with what you find, you can use the “Dissatisfied?” link at the bottom right to send feedback, or file a spam report (include the word “Bigdaddy” in your additional details).
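If you'd rather script the comparison than click around, the same results page can be fetched straight from that IP. A quick Python sketch, with a made-up query term:

    # Pull a results page directly from the Bigdaddy data center IP.
    # The query is a placeholder; swap in whatever you want to compare.
    import urllib.parse
    import urllib.request

    params = urllib.parse.urlencode({"q": "bigdaddy test"})
    url = "http://66.249.93.104/search?" + params
    with urllib.request.urlopen(url) as response:
        page = response.read().decode("utf-8", errors="replace")
    print(page[:300])  # peek at the start of the results HTML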

What is Bigdaddy?
Google has a new data center nicknamed Bigdaddy, as Googler Matt Cutts wrote back in January. Matt expects that Bigdaddy’s results, which are still experimental at this stage, will become the default for Google later on (possibly in a few months). Matt says, “[Bigdaddy] has some new infrastructure, not just better algorithms or different data. Most of the changes are under the hood, enough so that an average user might not even notice any difference in this iteration.”

