The mysterious Case of Google rankings

A couple of months ago there were rumours from the USA of case sensitivity in Google ranking results. This week I saw the first evidence of it in the UK results for one of my clients. What I saw initially was fairly dramatic. I won't show the actual keywords, just the pattern, for a query consisting of two keywords and a placename:

keyword keyword place – position 22
keyword keyword Place – position 14
Keyword Keyword Place – position 124

The following day the variation was rather less severe, but there were still noticeable differences, as indeed there were in other clients' results. It may take a while for the variations to settle down and for us to see the real picture, but if this is going to be a permanent feature of the rankings then it has considerable consequences for both clients and SEO practitioners.

For one thing, we need to try to work out what the average user would type in when making a search. Personally I've always used all lower case, and I suspect many others do the same, but where a town, city, or proper name is included, some people (possibly older people) may automatically capitalise the first letter of that word only, whereas younger people used to text-speak may not. Then there are acronyms – would you search for "UEFA Cup results" or "uefa cup results" when looking for European football scores?

Most SEOs agree that headings within your text content help tell the search engines what the following paragraphs are about, and thus help the page rank for the terms within them. Is your house style to use "Title Case For Headings" or "Title case for headings", or are you following the fashion for "title case for headings" in all lower case? What if your preferred style clashes with the way people search, so that you rank better for "Purple Widgets" but everyone searches for "purple widgets"? It could be we've just opened a Pandora's box and pulled out a minefield.

Another problem will be what to report to clients. By coincidence I followed a discussion on Sphinn this morning in which a number of SEOs argued that we shouldn't report ranking positions to clients at all – that it is traffic and sales that matter, and that with personalised search, geographical biasing, and variations between datacentres it all varies too much anyway. That would be great; I'm all in favour of stressing the end result as the important factor. But in practice most clients are pretty much hung up on their ranking positions and follow them themselves, in a rather unscientific and ad-hoc way, all the time. Who hasn't had a call from a client who sees a fall of a few places on a search term and immediately wants an explanation? (Which version of the search engine were you using? UK-only or worldwide results? Were you logged into an account? What other searches did you run today? Ten results per page or 100? Are you wearing blue socks or brown? Wouldn't you just love to ask that one! And so on.)

The fact is many of them don't understand SEO and are desperate for any sort of number to cling to in order to be sure they haven't hired a snake-oil salesman who will bleed them dry without any benefit to their bottom line. They insist on reports. And if they don't, their Managing Director does.

But you can't report on all the variations of multi-word search phrases – for every three-word term you'd have to check at least five alternatives – as the extra overhead would be enormous and would probably get your IP address banned for running too many, or automated, queries. So if case-sensitivity is here to stay, we'll have to discuss with clients which exact terms they want monitored. That'll be fun! What's the betting everyone stops reporting on MSN results!
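To see how quickly the overhead mounts up, here's a quick back-of-the-envelope sketch in Python. The phrase is invented (echoing the Purple Widgets example above), and it only covers lower-case versus initial capitals – add ALL-CAPS and mixed-case forms and the numbers get worse still:

```python
from itertools import product

def case_variants(phrase):
    """Return every combination of all-lower-case and
    initial-capital forms of the words in a phrase."""
    options = [(word, word.capitalize()) for word in phrase.lower().split()]
    return [" ".join(combo) for combo in product(*options)]

# Even with only two forms per word, a three-word phrase
# yields 2**3 = 8 variants to rank-check:
for query in case_variants("purple widgets anytown"):
    print(query)
```

Eight queries where one used to do, for every phrase on every report – small wonder an automated rank checker would soon start tripping Google's query limits.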

One thing about SEO, it’s never dull!

Google settling down again at last

Having been away for a much-needed holiday, I've spent the last two weeks catching up with developments through some fairly intensive search-ranking analysis. Google's recent bout of volatility in their results finally seems to be starting to even out. Since early May some very strange results have been reported on the SEO forums, changing day to day and often hour to hour. Though other people have reported varying symptoms, what I've seen across a number of different sites is that sites with top-ranking pages have seen little effect except on their weaker pages. Newer or generally weaker sites, with more middle- and lower-ranking pages, have seen those fluctuating substantially, with a big drop around the 3rd of June followed by a slow climb back to normal, occasionally punctuated by sudden rises or falls lasting a very short time.

Regular readers will know that I've seen this sort of episode many times before and that I advise a "don't panic, continue as normal" policy. Reindexes are inevitable from time to time to sort out errors and clear out old pages. Spam fighting is also likely to be needed occasionally, and experiments are going on constantly – Google now say they make about ten algorithm adjustments every week. One area where I've often noticed changes is the relative ranking of plural words versus singular. A few years ago, if you ranked for one version you almost automatically got the other, but now it's much more dependent on which one your text concentrates on. Having diverged markedly about a year ago, the gap had been closing over recent months, but now it seems to have opened up again. Another thing I've noticed recently is that search terms including the word "solutions" seem to have gone haywire, with some solid number-one results suddenly dropping out of the top 500. "Solutions" is a horribly generic term that is vastly overused in business, but such a drop, while other results for related terms remain stable, can only really be put down to an algorithm tweak.

Having had a bit of a rollercoaster for the last couple of months, let's hope for some rather more stable conditions in the run-up to the autumn and winter seasons, so we can all concentrate on developing our clients' site quality rather than constantly checking whether their rankings have disappeared. That is, after all, what Google tell us they want to see!

About Us pages

Ever since Google started mentioning rel=nofollow as a possible way of diverting PageRank away from "pointless" pages, there have been articles telling webmasters to use it on their Terms and Conditions and About Us pages – the implication being that About Us pages are not important. If you've ever read Michael Martinez over at SEO Theory, you'll know that he regularly rants against this practice.

He makes a number of excellent points, most of which I thoroughly agree with – particularly on the About Us page. It's something I've been trying to persuade clients of for a long time – so many of them either don't see the point of an About Us page or assume it's a boring page of no importance, to be cobbled together as an afterthought while they work on those shiny new product images. The really sad ones stick their mission statement up there!

So what possible advantage could this much-maligned page have? Well, consider someone with a business that is complementary to yours. Maybe he's looking to work in a new market area and has a partnership offer to make that would suit both of you. Or maybe someone in Europe is looking for a UK supplier of a product or service you offer, because their old local one is no longer in business. These people will want to learn something about your company before contacting you, so they can see whether you match their working philosophy. They'll more than likely look for your About Us page; they may well search for precisely that. And they may make a decision based on it!

Have something interesting to show them. Really tell people what your company is like, what it stands for, how many people are in it and what skills you have available. Say where you operate from and to, and where you have ambitions to expand.

As well as being useful searchable content, it's good PR and good branding. Don't just talk about your products – that's what the rest of the site is for – talk about *you*. As I've ranted about many times, it comes down to Trust. And Trust sells. So build it, starting with your About Us page.

And for the SEO geeks amongst you: in case it isn't obvious, I DON'T recommend "sculpting PageRank" by nofollowing links to About Us pages.
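For anyone who hasn't met the technique, the markup those sculpting articles advocate looks like this (the URLs are illustrative examples of my own) – and it's exactly what I'm advising against:

```html
<!-- The "sculpting" pattern I advise against: rel="nofollow" added to
     internal links in the hope of stopping PageRank flowing to them.
     URLs are illustrative only. -->
<a href="/about-us/" rel="nofollow">About Us</a>
<a href="/terms/" rel="nofollow">Terms and Conditions</a>

<!-- A plain, followable link - which is what I recommend: -->
<a href="/about-us/">About Us</a>
```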

Was your search really for what you entered?

There has been some speculation recently that searches in Google are no longer isolated from each other, and that the previous search you made may influence the current one, even when you aren't logged in to any form of account. This follows an interview by Danny Sullivan with Google's Marissa Mayer in April. This type of linked search has been used for a while in providing targeted AdWords adverts, but it has never previously affected natural results.

Just how much of a difference this might make to results is impossible to predict as yet, but it seems likely to mean that "standard" results are less and less standard. Some webmasters are linking this to the recent strong fluctuations in rankings – where positions have been markedly different within hours and sometimes minutes of queries being made – though personally I would be surprised if this were the cause of such major differences. That felt much more like a reindex or algorithm tweak, whereas I would expect the "previous query" effect to be either a relatively subtle refinement or a much more obvious shift of results where the earlier keyword has impinged on the later ones.

It does, however, have other implications for SEOs reporting to clients. We've become used to clients seeing slightly different results due to connecting to different datacentres; if they now start seeing different results depending on a previous query, then explaining such varying rankings is going to take even more time and persuasion.

All change in the big three

Search rankings have been fluctuating even more than usual recently. First Yahoo made a major algorithm change, so their results were shifting quite a bit. Then we noticed that MSN/Live had dumped most pages from its index and was respidering everything, which of course meant that rankings disappeared for a few days while it happened. And then Google weighed in with some noticeable changes of their own. It doesn't seem to be one of their really big shifts, but a lot of sites seem to have seen drops from previously stable high rankings while poorer-quality sites have temporarily moved up.

As I've said many times in similar circumstances, the best thing to do is sit on your hands and do nothing you wouldn't already be doing. Wait a couple of weeks or so and see what happens. As is usually the case, I'm now seeing most of my sites returning to their previous levels, although there will inevitably be some datacentres that reprise the poorer results occasionally until the system clears itself.

There does seem to be more day-to-day flux in Google results than was previously the case, so spot checks once a month are no longer sufficient to see the trends and variations. At the same time, though, you shouldn't obsess over constant ranking checks, nor should you concentrate on one or two favourite keyword phrases. Look at the whole site profile and remember that there is plenty of traffic to be had from long-tail searches. Those often convert better than the big headline phrases, which tend to attract more generic searches with relatively low conversion rates.

Google Webmaster Tools problem

And the wider issue of communicating with the search giant

For most of the last month or so, some sites have been unable to access the useful tools that Google makes available to site administrators. This manifested itself as a failure to verify "ownership" of the sites using either of the two methods available, with an error message varying between a server timeout and a DNS error in looking up the site.
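For those who haven't used the Tools, the two verification methods are a Google-supplied meta tag added to your home page, or an empty HTML file with a Google-supplied name uploaded to your site root – roughly as below. The token values here are placeholders of my own, not real ones; Google generates the genuine values inside the Webmaster Tools interface:

```html
<!-- Method 1: verification meta tag placed in the <head> of the
     home page. The content value here is a placeholder, not a real token. -->
<meta name="verify-v1" content="PLACEHOLDER-TOKEN=" />

<!-- Method 2: an empty file uploaded to the site root under a
     Google-supplied name, along the lines of:
     http://www.example.com/googleabcd1234efgh5678.html -->
```

Since both methods fail with the same timeout or DNS errors, the fault looks to be at Google's end when fetching the page, rather than anything in the sites themselves.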

Unfortunately this wasn't acknowledged as a problem until three weeks after it first started, and at the time of writing it still hasn't been resolved for many of my own sites, amongst many others. As a result, many webmasters have been wasting time trying to solve non-existent problems with their sites and making pointless support calls to their hosting companies. In the last week, at least, official Google staff have started making some individual responses to postings on the relevant Google Groups forum. That is a welcome development, but it serves to highlight the fact that Google are generally a very unresponsive company and getting hard facts out of them is extremely difficult.

To some degree I have sympathy with their dilemma: if there were totally open channels of communication, they would be deluged with millions of queries and complaints, many of them half-baked or misinformed at best – we've all seen the nutters and chancers who complain bitterly about dropping rankings when their sites are riddled with blatant black-hat techniques and spam. However, a way has to be found to allow genuine webmasters to report real problems.

With any system of the mind-boggling complexity of a global search engine there will inevitably be problems and bugs. But by not engaging with the webmaster community, Google are missing a perfect opportunity to get exactly the sort of feedback they need from people in a position to see the effects and give them early warning of possible errors. No matter how good Google's engineers are, they aren't looking at search results in the same intensive way that we are. Sometimes we'll see puzzling inconsistencies in data that ring bells for us, or we'll spot patterns when analysing SERPs over an extended period. You develop a sixth sense for when things are not quite right, and this could be invaluable to them in tracing problems.

Remember the Big Daddy update? For months webmasters were baffled by perfectly good sites losing all ranking; of course there was a lot of noise from the less reputable as well, but it was easy to tell that plenty of genuine people were suffering. For quite some time the official line was that there was no problem and people should just clean up their sites and add more content. Of course, many desperate webmasters ended up making major changes to try to win back some rankings and traffic to help their businesses survive. I myself lost a swathe of high rankings for things for which I was clearly one of the most relevant sites – not just dropped down a bit but dropped out of the index altogether – but was fortunately able to sit it out, making no changes. A good while later we started seeing a particular datacentre with rankings that looked much as they should be, and a few weeks after that the data was rolled out to all the datacentres and all my top rankings returned. With better communications, all that wasted effort, lost business, and vast quantity of forum chatter could have been avoided, and maybe Google could have got enough useful feedback to roll out the corrected update a bit sooner. And they wouldn't have lost so many friends or suffered such bad PR.

The development of Webmaster Tools was a great step forward, but I've seen a number of oddities in it at times. For instance, one client's site was (quite naturally) largely based around two keywords, yet one of them wasn't listed in the "How does Googlebot see your site" section. This seemed bizarre, since the same term was prominently used in the link text pointing at the site, but it did raise suspicions about an apparent penalty the site seemed to be suffering from when we took over the account. We tried emailing Google about it but received no response. That could have been an opportunity for a useful dialogue, helping us know where to look to ensure their site was clean and of good quality.

Other issues come to mind. I regularly see a set of results coming round that is pretty obviously broken data – a range of ranking terms all drop out for a few days and then go back to normal for the next month or so, only to repeat the cycle again. A couple of months ago the rankings for this very blog dropped away suddenly, and I later discovered I'd been hacked at exactly the time the drop started. However, I didn't receive the message in the Tools that we are led to believe is sent in such cases. I was lucky and found the problem with the help of a correspondent; others may not be so fortunate if they rely on the messages.

A feedback form of some sort within the Tools would be at least partly self-filtering: the nutters could easily be ignored in any case, since their sites would likely already be flagged as dodgy. Of course that wouldn't help much in the current case, since many of us still can't verify our sites – unless it were situated on the account's opening page.

So come on Google, let’s come up with a method of sensible collaboration that will help both sides.