What use are usability experts?

I feel a rant coming on… The other day I was preparing an email to a client about why it wasn’t a good idea to have lots of ‘click here’ links all over her site, and thought to check a few of the usual places like the W3C and accessibility sites to give her as references. My eye was caught by a description line in the search results which took me to a book review about usability. I won’t name the author, as I’ve heard him make perfectly sensible points before, but some of the stuff apparently in this book had me shaking my head in disbelief.

Now remember that I’ve been building sites since 1994 and have spent much of that time insisting, to anyone who’d listen, on the need for logical, well-structured navigation and ease of use. As an SEO I insist on that too, along with well-structured text in digestible chunks, written in attractive and comprehensible language. What I don’t advocate is dumbing down.

This author, however, clearly does, if the summaries and reviews are to be trusted. One of his suggestions was ‘halve the amount of text on the page, then halve it again’. Astonishing. Presumably he doesn’t want any search rankings – text is fundamentally what search engines index. Presumably he doesn’t want well-reasoned and informative content either.

We are informed that people don’t read web pages, but only scan them. Certainly a lot of scanning goes on, but when you find something useful then you read it. The scanning is largely part of the human search routines – we are presented with multiple possible sites when we make a search and we then visit them, scanning quickly through to see if they are relevant to what we’re looking for. But we are looking for good quality sites that have useful information, not for dumbed down summaries with no real value.

Another headline of the book was ‘don’t make me think’, along with the suggestion that it doesn’t matter how many clicks you have to make as long as it’s a mindless choice. What an appalling indictment of the assumed intelligence of users and a dreadful waste of the largest collection of information the world has ever seen! It seems to me as if a lot of this is driven by a view of the internet as just a massive selling machine with SEO seen as just a way to cheat your way to easy money. Given the true value of international communication and the hopes that were invested in the net in the early days, I fervently oppose such a view. Make your sites as good as they can be with your SEO’s help and you’ll have lasting value that will deserve to rank well.

While I’m on a roll, does anyone take Jakob Nielsen seriously? At the start he made some useful observations, but now most of what he says seems more designed to maintain his guru status, and his examples have always been awful. He keeps telling us to keep it simple – good, within reason – but then says do it like Amazon!! Only one of the most hopelessly cluttered sites around. It sells a lot of books because it sells them cheap – not because of the design.

Which brings us back to navigation – make it logical, not mindless. And never mind the usability gurus.

The 5 most mis-handled HTML tags

Since I don’t currently have a blog on the web design site I thought I’d drop this one in here. After all, trying to do SEO on top of poor foundations is a thankless task. It was inspired by a blog item which I “think” I saw on SEOmoz entitled The 5 most underused HTML tags, though now I can’t seem to find it. (Good blog incidentally, recommend it.)

HTML has undergone many revisions and fashions in its short life, and there have been a number of blind alleys and wrong turnings on the way. This has resulted in the complete misuse / miscoding / misunderstanding of many of the tags which give the markup language its tools. By this I don’t mean those horrible tags like blink or marquee which made so many sites unbearable in the late 1990s, but the ones that competent designers should know how to use.

Unfortunately some of the WYSIWYG programs which appeared when website building became popular were responsible for lulling people into thinking that they didn’t need a solid understanding of HTML. Some of them made it hard to even see the underlying code, while many of them produced code that was at best convoluted and sometimes completely invalid. Only the fact that the most popular browsers were far too forgiving of poor code allowed the resultant sites to get away with it.

So here we present:
The most mis-coded HTML tags
The most misused HTML tags
The most misunderstood HTML tags

The table tag

What’s so wrong with the table tag, I hear you cry? Nothing much in itself – it’s a perfectly reasonable tag, and I don’t even mind too much if you use it for positioning (though it wasn’t intended for such things) – but please don’t give it attributes that are invalid, chiefly height. It’s not good code, and mostly it proves you’ve used a WYSIWYG program without understanding what it does. Secondly, try not to nest tables more than three or four deep – if you do, you’re probably trying to create positioning that should be handled with CSS.
And for goodness’ sake don’t try to put an absolutely positioned div inside one.
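By way of illustration – the class name here is invented for the example, but the pattern is the real point:

<!-- Invalid: height is not a legal attribute of the table tag in HTML 4 -->
<table width="600" height="400" border="0">
  <tr><td>Content</td></tr>
</table>

<!-- Better: keep the markup clean and let CSS handle the sizing -->
<table class="content-box">
  <tr><td>Content</td></tr>
</table>

/* in the stylesheet */
.content-box { width: 600px; height: 400px; }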

The meta keywords tag

The spammers’ favourite. Stick every keyword you can think of in there, throw in Britney Spears, Paris Hilton and the kitchen sink. Ignore the fact that everyone except Yahoo stopped paying attention to it years ago (except to penalise spamming of it), and that even Yahoo checks whether your body text includes the terms and penalises you if it doesn’t.
Then wonder why your site doesn’t rank…
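For the record, this is the sort of thing I mean – the terms below are invented for the example:

<!-- The spammer's version: dozens of terms, most of them nowhere in the body text -->
<meta name="keywords" content="britney spears, paris hilton, cheap flights, free downloads, best widgets, widgets, widgets">

<!-- If you use the tag at all, a handful of terms the page genuinely covers is plenty -->
<meta name="keywords" content="search engine optimisation, edinburgh, scotland">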

The font tag

C’mon guys, they announced that this was being deprecated back in the 90s. The occasional bit of in-line use might be forgivable – building new sites that use it all over the place for basic formatting in 2006 is just amateurish. If you can’t even learn CSS1 then stop claiming to be a web designer, and then we can dump this tag in the bin along with “blink”.
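A quick before-and-after sketch – the class name is invented for the example:

<!-- The 90s way: the deprecated font tag repeated on every paragraph -->
<p><font face="Verdana" size="2" color="#333333">Welcome to our site.</font></p>

<!-- The CSS1 way: one rule in the stylesheet, clean markup everywhere else -->
<p class="intro">Welcome to our site.</p>

/* in the stylesheet */
p.intro { font-family: Verdana, sans-serif; font-size: 0.8em; color: #333; }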

The span tag

It’s not a div tag: you can’t have block-level tags inside it. It’s for in-line styling within tags such as the p tag – for exceptions to the general rule, not for use all over the place.
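For instance – the class names are invented for the example:

<!-- Wrong: span is an in-line element and cannot legally contain block-level tags -->
<span class="notice">
  <p>Opening hours</p>
  <p>Monday to Friday, 9 to 5</p>
</span>

<!-- Right: a div groups block-level content; a span marks a few words inside a paragraph -->
<div class="notice">
  <p>Opening hours</p>
  <p>Monday to Friday, 9 to 5 - <span class="warning">closed on public holidays</span>.</p>
</div>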

The form tag

Probably the single most miscoded tag in the world. If you must mix it with tables, either put it inside a table cell (td tag) or wrap it around the whole table – do not put it round a tr tag. Not only will it throw up a validation error on that tag, it’ll produce knock-on errors that have the validators running in circles, and it’ll turn any attempt to style the form effectively with CSS into a horrible mess.
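Roughly speaking – the simple search form here is invented for the example:

<!-- Invalid: a form wrapped around a tr will not validate and causes knock-on errors -->
<table>
  <form action="/search" method="get">
    <tr><td><input type="text" name="q"></td></tr>
  </form>
</table>

<!-- Valid: the form sits outside the table (or entirely inside a single td) -->
<form action="/search" method="get">
  <table>
    <tr><td><input type="text" name="q"></td></tr>
    <tr><td><input type="submit" value="Search"></td></tr>
  </table>
</form>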

Wouldn’t it be nice to see these being used properly in 2007!

Happy Hogmanay and a Good New Year to you all.

Google problems and MSN weirdness

Since around August there has been an intermittent problem for .com sites with Google results in country-specific listings (e.g. searching Google.co.uk and selecting UK-only results). What has been happening is that the home pages of these sites drop out of the country-specific indexes, although they are still there in the global indexes. (To check whether you’re affected, simply do a site:www.mydomain.com search in the two versions of the results.) Along with this drop-out, the rankings for any search term that is in any way reliant on the home page drop as well, and this has been causing problems for many businesses. A strange symptom that also often appears in these cases is that an https version of the domain address shows up in the indexes, even though there is usually no such version in existence.

There’s been a fair bit of discussion of this in the forums (for instance on webmasterworld), and those affected have reported that their sites periodically reappear and then drop out again, which agrees with what we’ve seen on some affected sites that we monitor. By no means all .com sites are affected, and various theories as to what might be triggering the problem have been put forward and proved faulty. The general opinion seems to be that at some point Google tried to introduce some sort of geo-specific filter which has gone wrong, and that they have subsequently been trying to fix it. However, none of us really knows, and Google aren’t saying anything, even in reply to specific questions.

Sadly this is par for the course and is difficult to understand. We saw the same thing last year when what is now called the Big Daddy update took place. At that time many perfectly ethical sites dropped out of the indexes for a number of months while many spammy sites seemed to reappear. Eventually the update was rolled out and most of the good sites came back and reclaimed the rankings they had held previously. One of my own sites suffered this effect, even for some very specific terms for which I was clearly amongst the most relevant sites. At that time there was a great outcry but very little if any useful feedback from Google – just the usual generalities about making sure you do all the normal ethical things.

Now you can understand Google not wanting to give too many direct answers, because they must certainly be inundated with queries from both ignorant users who can’t manage to read the T&Cs and spammers trying to get back into the rankings after being banned. However, when people have legitimate questions and there are provable problems with the results, you would think it would benefit them, from a technical as well as a PR point of view, to be a little more communicative. While the sitemap / webmaster interface is a step in the right direction, there are still many questions which remain unanswered. SEO clients, who are often big businesses themselves, cannot understand why their SEOs can’t just phone up Google and get answers. The apparent climate of secrecy contributes to the “smoke and mirrors” reputation that SEO has.

The current situation only gives credence to the cynics who point out that these major index drops always seem to happen in the run-up to Christmas, and who claim they are deliberately engineered to boost AdWords revenue. The best way to scotch such suggestions would be for Google to be more open about what is going on and to admit problems when they occur. Even if nothing can be done to speed up the solutions, it would ease the minds of those affected, stop them tinkering with their sites in ways that may well be harmful in the long term, and greatly increase their respect for a company that was founded on the motto of “don’t be evil” but whose reputation is no longer seen as whiter than white.

Google aren’t the only ones showing some strange results at the moment; MSN / Windows Live is also doing some odd things. While checking my own listings I noticed that if you go to msn.com (which these days no longer redirects you to the UK version) and search for “search engine optimisation scotland”, the top-ranked site is a holding page for a forthcoming site and contains no content apart from a handful of Google AdSense ads. Surely some mistake!

Trust

I’ve been thinking a lot recently about trust, and it seems to me that this is a concept that applies in both Search Engine Optimisation and in successful web site construction and interacting with users/customers.

To get Google to rank you highly you have to persuade it to trust your site. Trust it not to attempt to spam the indexes with cheap tricks. Trust it not to duplicate or steal content. Trust it not to join any link farms or other dubious linking arrangements designed purely to raise your rankings. We even talk about “TrustRank” as a concept superseding PageRank. If a site has done any of these things then that trust is reduced and you have to earn it back again before you’ll rank well for the terms involved.

Similarly with users. If you want them to buy from your site or follow up on your services then you need to persuade them that you’re trustworthy.

You do that with your design:
Does it look professional and geared to the subject, or does it look like someone knocked off a cheap template and stuck some text on it? Does it assist the user in finding what they need or is it just showing off the skills of the guy that did the Flash intro that takes a minute to load and wastes everyone’s time?
Does your navigation work effectively and cleanly? If not then why would anyone trust you to build a checkout system that they have to give their credit card details to?

You do it with your text:
Does it address the user in an engaging manner and tell them succinctly what you can do for them and how they can find out more? Or does it throw reams of over-hyped sales-speak at them, or maybe paragraphs of the generalised business jargon I’ve complained about before?
Is it transparent about who you are and where you’re from? Is it easy to contact you if they have a problem?

If you don’t tell them what they need to know about you and your services then they’ll go elsewhere because they don’t trust you to deliver.

Next time you look at your site pretend that you’ve never seen it before. Would you trust it?

Out with MSN – In with Windows Live

Windows Live, the planned search system that will be integrated with the desktop in the next version of Windows, has now come out of beta (though at the time of writing the URLs generated by the searches still have “beta” in the address!).

This seems to herald the next major change in Microsoft’s attempts to compete with Google in the field of search. MSN results have been more than a little erratic in the last few weeks, and the suspicion is that they have been moving over to the new results generated by the Windows Live search engine. In some ways this is a pity, as the MSN results were refreshingly direct in their approach. If you had the content then you could get the ranking. There are times when the amount of anti-spam effort that Google puts in, plus their link-based emphasis, makes their results seem over-refined and biased towards commercial sites – they seem to miss interesting new sites with fresh content for far too long. We must hope that MSN/Windows Live doesn’t throw the baby out with the bathwater.

The other aspect of Windows Live is the local search capability. It’s probably too early to tell how effective this is – a check from an Edinburgh base was throwing adverts at me for London and Australian SEO companies but that may just be teething troubles. They seem to have an interesting method of tracking where you are situated by using Wi-Fi data or IP address, though privacy concerns may make some users reluctant to sign up for those.


Let’s be realistic…

This last month’s trend seems to have been for calls to come in asking if we can get rankings on some incredibly generic keywords. Single words, very common products, phrases for very common software packages, phrases for well-known companies which the potential client sometimes works with. In one case not only were the phrases in all the above categories but most of them weren’t even mentioned on the site!!

Now the simple fact is that this sort of result really isn’t at all realistic. Nor in many cases is it even desirable.

Single-word phrases – most people don’t use them. A recent survey reckoned that only around 10 or 11% of searches were on single terms.

Very common products – unless you already have a site of fairly long standing which ranks moderately for such a term, your chances of getting good rankings are very poor. You need to find a niche market and niche phrases to go with it. In any case, how many people search for generic products? Would you search for “cosmetics” or would you search for your favourite range or an individual named product?

Common software products – do you really think it’s likely that you can get a top ranking for “Microsoft Office” when there are millions of articles about it, advice pages for it, and so on?

Well-known company names – it certainly won’t be easy to get rankings for, let’s say, Ford just because you did some work for them or are a supplier. The top slots will already be taken by their own sites and by articles about their products. In any case, what’s the point of ranking for their name unless you sell or service cars? People who search on Ford are generally interested in cars, not in an accountancy firm that worked for them, or a graphic designer who did some brochures for them, or whatever. So the chances of getting useful traffic are very slim even if you do succeed in getting the rankings.

And the chances of getting the rankings? Well, at the very least you’d need to have pages and pages of text concerning that product or company in order to be remotely relevant for it, and you’d need lots of links from strong related sites pointing to your pages using the keywords in the link text.
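And the link text itself is part of that – a link that just says “click here” carries none of the keywords, while descriptive anchor text does. The page name and phrase below are invented for the example:

<!-- Tells neither the user nor the search engine anything about the destination -->
<a href="oak-widgets.html">Click here</a>

<!-- Descriptive anchor text that carries the niche phrase -->
<a href="oak-widgets.html">hand-made oak widgets</a>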

Remember that useful traffic is what it’s all about at the end of the day – good rankings are useless in themselves unless they produce traffic that will hang about and do your company some good – whether in simple monetary terms or in PR.

So think about what sort of traffic you’re aiming to get before choosing your keywords. And think about what’s realistic in a crowded search marketplace.