Google Webmaster Tools problem

And the wider issue of communicating with the search giant

For most of the last month or so, some sites have been unable to access the useful tools that Google makes available to site administrators. The problem manifested itself as a failure to verify “ownership” of a site using either of the two methods available, with an error message that alternated between indicating a server timeout and a DNS error in looking up the site.
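If you’re hit by this, it is worth ruling out problems at your own end before concluding the fault lies with Google. The minimal sketch below (the verification filename and meta-tag marker are placeholders – yours will be whatever Google issued for your account) simply fetches the verification file and the home page from outside your network and times the responses, which is roughly what Google’s verification check needs to succeed:

import time
import urllib.request

SITE = "http://www.example.com/"              # your site's root URL (placeholder)
VERIFY_FILE = "google1234567890abcdef.html"   # placeholder for the file Google asked you to upload
META_MARKER = 'name="verify-v1"'              # placeholder for the verification meta tag

def timed_fetch(url, timeout=10):
    # Fetch a URL, returning (status, seconds taken, body or error message).
    start = time.time()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            return resp.status, time.time() - start, body
    except Exception as exc:
        return None, time.time() - start, str(exc)

# The uploaded verification file should return 200 without a long delay.
status, secs, _ = timed_fetch(SITE + VERIFY_FILE)
print("verification file: status=%s in %.2fs" % (status, secs))

# The home page should be reachable and actually contain the meta tag.
status, secs, body = timed_fetch(SITE)
print("home page: status=%s in %.2fs, meta tag present: %s"
      % (status, secs, status is not None and META_MARKER in body))

If both checks come back quickly with a 200 status and the tag present, the timeout and DNS errors reported in the Tools are unlikely to be your server’s doing.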

Unfortunately this wasn’t acknowledged as a problem until three weeks after it first appeared, and at the time of writing it still hasn’t been resolved for many of my own sites, among many others. As a result many webmasters have wasted time trying to solve non-existent problems with their sites and making pointless support calls to their hosting companies. In the last week there have at least been some individual responses from official Google staff to postings on the relevant Google Groups forum. That is a welcome development, but it also highlights how unresponsive the company generally is and how difficult it can be to get hard facts out of them.

Now, to some degree I have sympathy with their dilemma: with totally open channels of communication they would be deluged with millions of queries and complaints, many of them half-baked or misinformed at best – we’ve all seen the nutters and chancers who complain bitterly about dropping rankings when their sites are riddled with blatant black-hat techniques and spam. However, a way has to be found for genuine webmasters to report real problems.

Any system with the mind-boggling complexity of a global search engine will inevitably have problems and bugs. But by not engaging with the webmaster community, Google are missing a perfect opportunity to get exactly the sort of feedback they need from people who are in a position to see the effects and give early warning of possible errors. No matter how good Google’s engineers are, they aren’t looking at search results in the same intensive way that we are. Sometimes we’ll see puzzling inconsistencies in the data that ring bells, or we’ll spot patterns when analysing SERPs over an extended period. You can develop a sixth sense for when things are not quite right, and that could be invaluable to them in tracing problems.

Remember the Big Daddy update? For months webmasters were baffled by perfectly good sites losing all ranking; there was a lot of noise from the less reputable as well, of course, but it was easy to tell that plenty of genuine people were suffering. For quite some time the official line was that there was no problem and people should simply clean up their sites and add more content. Many desperate webmasters ended up making major changes to try to claw back rankings and traffic so their businesses could survive. I myself lost a swathe of high rankings for terms where I was clearly one of the most relevant sites – not just dropped down a bit, but out of the index altogether – yet I was fortunately able to sit it out and make no changes. A good while later one particular datacentre started showing rankings that looked much as they should, and a few weeks after that the same data was rolled out to all the datacentres and all my top rankings returned. With better communication all that wasted effort, lost business and forum chatter could have been avoided, and Google might have gathered enough useful feedback to roll out the corrected update sooner. They also wouldn’t have lost so many friends or suffered such bad PR.

The development of Webmaster Tools was a great step forward, but I’ve seen a number of oddities in it over time. For instance, one client’s site was (quite naturally) largely based around two keywords, yet one of them wasn’t listed in the “How does Googlebot see your site” section. This seemed bizarre, since the same term was prominently used in the link text pointing at the site, but it did feed our suspicions about a penalty the site appeared to be suffering from when we took over the account. We emailed Google about it but received no response. That could have been an opportunity for a useful dialogue – one that would have helped us make sure the site was clean and of good quality by knowing where to look.

Other issues come to mind. I regularly see a set of results come round that is pretty obviously broken data – a range of ranking terms all drop out for a few days, then return to normal for the next month or so, only to repeat the cycle. A couple of months ago the rankings for this very blog dropped away suddenly, and I later discovered I’d been hacked at exactly the time the drop started. However, I never received the message in the Tools that we are led to believe is sent in such cases. I was lucky and found the problem with the help of a correspondent; others may not be so fortunate if they rely on those messages.
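Since those messages can’t be relied on, a crude independent check is better than nothing. The sketch below (the URLs are placeholders for your own key pages) hashes each page on every run and flags any that have changed since the previous run – a changed hash doesn’t prove a hack, and legitimate edits will trigger it too, but it’s a prompt to go and look:

import hashlib
import json
import urllib.request
from pathlib import Path

# Key pages to monitor; these URLs are placeholders for your own.
PAGES = ["http://www.example.com/", "http://www.example.com/about/"]
STATE_FILE = Path("page_hashes.json")

def fetch_hash(url):
    # Return a SHA-256 hash of the page body, or None if the page can't be fetched.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return hashlib.sha256(resp.read()).hexdigest()
    except Exception:
        return None

previous = json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}
current = {url: fetch_hash(url) for url in PAGES}

for url, digest in current.items():
    old = previous.get(url)
    if digest is None:
        print("UNREACHABLE: " + url)
    elif old and old != digest:
        # A changed hash doesn't prove a hack, but it's worth inspecting the page.
        print("CHANGED: " + url)

STATE_FILE.write_text(json.dumps(current, indent=2))

Run it daily from a scheduled task and it will at least tell you the day something on your pages changed, which is exactly the clue I needed and didn’t get.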

A feedback form of some sort within the Tools would at least partly filter out the nutters, and they could easily be ignored in any case since their sites would most likely be flagged as dodgy already. Of course that wouldn’t help much in the current case, since many of us still can’t verify our sites – unless the form were placed on the account’s opening page.

So come on Google, let’s come up with a method of sensible collaboration that will help both sides.
