[edit – I must be psychic – the same day that I wrote this post the Google Webmaster Central Blog announced
Supplemental goes mainstream. Seems they don’t agree with my reasoning and are determined to make it harder for us. Only time will tell if the results from the two indexes really do come together, but certainly the comments on that announcement seem pretty skeptical.]
There’s a lot of discussion at the moment about Google’s removal of a search method (though it looked more like a hack) which was supposed to allow you to check which of your pages were in the supplemental index rather than the main one.
It should be said that there was always some doubt about this method – it seems to have thrown up inconsistent results and there was a suggestion that it was possible to appear in both indexes at the same time so the results could be misleading. No wonder supplementals are a source of confusion.
At various times we’ve had Matt Cutts suggesting that we’re all too hung up on supplementals, and even that Google might remove the supplemental indicator from the site: command results. [edit – which is what they’ve now done] The latter seems like a very retrograde step: it will leave people even more in the dark than ever about what Google thinks of their pages, and it will simply fuel the sort of rumours that cause so many problems for those without the experience to follow good practice or the money to employ professional help, rumours which Matt himself will then have to debunk.
So are we all hung up on supplementals? Well, there seems little doubt that a supplemental page will not rank in a competitive field. I recently saw a revamped site whose detail pages were poorly connected to the rest of the site. To remedy this an intermediate page was built which contained a great deal of information and linked to subpages, which again had lots of good quality information, around which the deepest detail pages were clustered. When this intermediate page first went live it immediately started to rank for a number of terms, and the subpages also started to rank. A few weeks later, however, the intermediate page dropped into the supplemental index, as did the subpages; the rankings immediately vanished, and have not returned.
Such a situation seems to lack natural justice and is likely to cause confusion amongst reputable webmasters who are trying to produce good sites – which is what Google claim to want. Since we don’t know exactly why pages go supplemental, some people will speculate and be tempted to take actions that may not benefit their users. Cutts has hinted that one of the reasons is lack of PageRank. That sounds dangerously like an invitation to point lots of links at the pages concerned, and like an admission that quality is secondary to link strength. And then they wonder why people go out and buy links!
We can also look at the other end of the problem. Let’s allow for the moment that there are perfectly good reasons in the algorithm for downgrading a page – that a particular page simply isn’t good enough to rank in the main index. Shouldn’t a conscientious webmaster be given a method to check which pages of their site are regarded as poor, so they can take action to improve them?
Come on Google, give us the information we need to improve our sites, and then reward those improvements accordingly. Happier webmasters, better sites, better quality search results. Isn’t that what we all want?