Constant change needs constant rethinks

The last few months have been a whirl of business development, networking, new clients and juggling time constraints – not to mention trying to keep up with Google’s ever-changing activities. So maybe it shouldn’t come as a surprise to read my own main website and find loads of things that I need to rewrite and re-evaluate.

In some ways it’s the old “cobbler’s children” situation; always working on other people’s sites leaves little time or perspective for your own. In other ways it’s the nature of the SEO business – I often say that if you go on holiday for a week then you have to spend two weeks catching up after you get back. And there’s also a bit of change now that I’m fully independent rather than making decisions with others. But the longer I spend in this business, the more often I realise that we’re constantly learning and adapting to changing market conditions, and that ideas and tactics that were gospel six months ago are now totally different. You have to step back and remember that increasingly often, lest you find yourself saying something you no longer believe.

So I’ll be doing a fair bit of thinking about how I describe my approach to web optimisation and marketing and much changing of the site will ensue to reflect my current strategies.

There’s lots of rethinking to be done at the moment, following the major changes Google has been rolling out recently – particularly the new personalised search change, which could potentially revolutionise the search market. I’ll be writing about that and the effects it could have in the very near future, but for now it’s off to do some strategic thinking, editing and site analysis.

More Google Geolocation oddities

Sometimes Google really puzzles me. I just did a search from the UK, using Google.co.uk, selecting UK-only results, for “chatsworth bed”. (I was checking to see how Chatsworth bed was doing against the competition.) The Local results that Google inserted into the serps after the third natural result were:

Local business results for Bed near Chatsworth, CA, USA

Say Huh??? Why would they possibly think that would be relevant to me when I asked for UK-only results?

I was using Firefox. Strangely when I repeated the search using Safari there were no local results at all. Very odd.

Google through the looking glass

My last post discussed changes in the search landscape but it seems Google is busy changing both the business landscape and the real landscape.

A couple of weeks ago we had SEORoundtable reporting that Google was producing some very strange results in its own search engine – searches in Google for “Google Ireland” were returning Google New Zealand as the number one result. Other people soon chimed in with similar results in South Africa and South America. All good fun, and most of the examples I saw then have been sorted out by no-doubt embarrassed Google engineers.

This morning I was reading a somewhat mind-boggling business analysis by Bill Gurley of their maps business and the effects it’s having on GPS companies in the US. If Bill’s right, then the effects on mobile phone companies as well as GPS ones could be considerable. The sort of “Less Than Free” business model he describes, if expanded to netbooks and other places where their operating system and maps technology could apply, could put them in a position to topple Microsoft from its software domination and leave them impregnably placed to serve adverts just about everywhere.

But what are we to make of the third item which was highlighted by SearchEngineLand today? It appears that Google Maps has made up a fictitious town in England called Argleton amusingly described by nearby resident Roy Bailey. Is this a “trap location”, a careless mistake by their mapping partners, or a wicked English attempt to usurp the position of Brigadoon as the world’s favourite disappearing village? 😉

Still feeling confident about following those GPS directions? Think I’ll stick to good old Ordnance Survey!

Bing and Twitter could shift the search landscape

As Danny Sullivan has just announced, Microsoft’s Bing has launched a Twitter search system at www.bing.com/twitter, which includes a front page featuring a hot-topic cloud and a list of top tweets.

This gives Microsoft a major boost in the increasingly important area of real-time search, an area in which Google is perceived by some to be at its weakest. With Bing soon to be providing search results to Yahoo this marks the beginning of a new phase in the battle for search market share.

In recent years Google has been supreme – in the US they are approaching 70% of conventional searches and in the UK it’s around 80% – but now it seems that the landscape could be moving under their feet. Real time search is becoming the “in” thing and here it’s Twitter that has the most up to date data streams, while in the area of people search and social connection Facebook contains a massive amount of data that is mostly hidden from search engines.

It’s not yet clear whether Twitter’s data will also be available to other companies, including Google, but certainly Bing has stolen a march here and if it can become seen as the standard place to undertake searches of this type then it could swing the balance of power towards Microsoft. The next few months could be very interesting indeed.

Webmaster Tools drops the pixie dust

In the last few days there have been some additions to, and one removal from, Webmaster Tools. The removal is in some ways the more interesting for what it might signal for the future – PageRank is no longer mentioned.

Now, it was never much of a feature – they told you whether you had a lot, a medium amount, or not very much PageRank, and they told you which of your pages had the most. Not exactly rocket science, that last one, given that it’s very unusual for the home page not to be the one with the most PR. The assumption was that the bars indicating the amount of PageRank corresponded to the PR on the Google Toolbar, though I’m not certain that was ever explicitly stated, and the numbers that appear on the Toolbar were never used.

Barry Schwartz on Search Engine Land believes they should drop Toolbar PageRank as well, to stop webmasters obsessing about it. In many ways I agree with him, although in my experience it’s clients who obsess about it – them and the crazy link builders who send you emails requesting link swaps only from high-PageRank pages. In some ways, though, we’re all right to obsess about PageRank – the real stuff, rather than the poor, out-of-date approximation the toolbar shows. It’s still a lack of PageRank that prevents perfectly good content from ranking, by dropping the page it’s on into the supplemental index rather than the main one. The fact is we need a genuine indication of PageRank so we can get an idea of how it’s being distributed across our sites.
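That distribution through internal links is just the classic PageRank calculation at work. As a rough illustration of the idea – this is the textbook power-iteration algorithm with a hypothetical five-page site and the usual 0.85 damping factor, not whatever Google actually runs – a sketch in Python:

```python
# Textbook PageRank via power iteration over a tiny internal-link
# graph. The site structure below is entirely hypothetical.

DAMPING = 0.85      # standard damping factor from the original paper
ITERATIONS = 50     # enough for a graph this small to converge

# Each page lists the pages it links to.
links = {
    "home":     ["products", "about", "contact"],
    "products": ["home", "widget"],
    "widget":   ["home"],
    "about":    ["home"],
    "contact":  ["home"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with a uniform split

for _ in range(ITERATIONS):
    # Base "teleport" share every page receives, plus shares passed
    # along each outgoing link.
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:8s} {r:.3f}")
```

Run on this toy graph, the home page unsurprisingly ends up with the largest share, since every other page links back to it – which is exactly why a page that only receives a sliver of internal link strength can struggle to rank.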

I’ve recently taken to showing the SEOmoz mozRank in my Firefox status bar (using the SearchStatus extension) instead of the Alexa rank, which I find less useful. It’s interesting to compare it to the Google version, because sometimes the Google figure shows zero on pages that you’d expect to have a figure of 2 or 3, and you’re never quite sure if there’s a problem with the internal linking structure or if it’s just Google being obtuse. So far the mozRank figures suggest there’s nothing wrong and that link strength is being distributed as expected. But when you have a page that isn’t the one ranking in Google for its main target keywords, you have to wonder whether the toolbar is telling the truth and they’ve dropped it to supplemental status. What you don’t know, of course, is why. If the internal linking isn’t giving the page enough strength, for some reason known only to Google, then the only recourse is to try to get someone else to link to it for you – and that is one good reason why some people have an obsession with PageRank.

To be honest, Webmaster Tools isn’t as useful as it used to be. Since the changes they made a few months ago, the external link results have been both much lower than previously and very inconsistent, while the internal links have been showing some very odd figures – often the home page is shown as having far fewer links to it than some other pages, despite actually being linked to from every page of a site. Such results make it hard to have any confidence in what we’re being told, and harder for genuine webmasters who want to build good sites to do so. There’s no point in telling us, as they perpetually do, that we should just build good content when very often that good content ranks far behind some spammy pile of scraper droppings simply because of a lack of PageRank, and we have no tools to measure what’s going on.

And then they wonder why some people are also obsessing about link building!

Whose side(wiki) are they on?

Google have now released something they’ve been beta-testing for a while – SideWiki, a commenting system built into the latest incarnation of the Google Toolbar. Basically, what it does is show a sidebar in your browser listing any comments that anyone makes about your site. Systems designed to do this sort of thing have been tried before, but never by anyone with the marketing power of Google.

It’s not entirely clear why they think this is a good idea – in the face of initial criticism the attempted defence by Matt Cutts that you can use it to warn people off a spam or scraper site is perhaps the weakest pronouncement he’s ever come up with – but if it does take off then an awful lot of webmasters are going to be very unhappy about it.

Basically, it presents content that you, as a webmaster, are not in control of to users of your site, and there has to be something fundamentally wrong about that. If someone wants to criticise my site on their site, then that’s their right (as long as they don’t cross the line into libellous territory). If they want to criticise using my own commenting system, then I have some control over that and can reply to their comments. But to present content as part of the experience of visiting my site, without my permission and with no means for me to respond or complain, is simply hijacking. There could be links to competitors’ sites, there could be spam, there could be adverts, there could be objectionable material such as racist comments. The average user simply won’t understand that this material is not mine, and as always, mud will stick.

Sorry Google, you have got this horribly horribly wrong. I hope it falls flat on its face the way the earlier systems did, or that you reconsider and withdraw it.

For the sake of the web in general – which you yourselves described as a cesspit – it’s an open door to the worst kind of spam and dirty practice.

For the sake of all the genuinely ethical small web businesses who already have far more to do than they can handle to keep themselves going in the face of spammers and scrapers.

And actually for the reputation of Google itself, because this will be a massive PR blunder in which you are seen to be setting yourself up as the supreme ruler of the internet and doing something which is directly against the interests of the people who build and maintain websites.