Google dumping data?

I’ve mentioned a few times recently that I was seeing some pretty inconsistent search positions on a number of my sites. There now appears to be a potential explanation. Following a post from Michael Martinez of SEO Theory, a couple of the SEO forums are running threads which suggest that Google may be dumping data in a way that we’ve seen them do before, and that as a result some sites are seeing lower numbers of indexed pages.

This coincided with a drop in indexed pages on a couple of the sites I monitor, and with the strange appearance of a large number of erroneous links in the Google Webmaster Tools reports for another site. That site is a bookshop, but it was being reported as ranking highly for a number of totally unrelated topics. The external links report showed a big increase in the number of links, and their origins and link text tallied with the strange search terms. Needless to say, the pages concerned had no links pointing at this site, nor would they ever have any reason to.

What I now suspect, though obviously this can only be conjecture at this stage, is that somewhere along the line the indexes have become corrupted and Google is having to rebuild them. We saw something similar a couple of years ago when they brought a load of new data centres online and something appeared to go wrong. It's also possible that this has something to do with the dreaded Supplemental index, although that is pure speculation.

Whatever the reason, the only thing to do is sit tight, let it all blow over, and see what the search results look like in a few weeks. Keep producing new content, as that still appears to be picked up as normal, but don't do anything drastic with the old content; it will probably come back into the indexes in good time.
