Mission Impossible 4 – Revenge of the Black Hat

I had an interesting phone call yesterday. A polite young man suggested that my SEO skills were in great demand in London, Dublin, and Gibraltar and that I would be looking at a salary of £50,000. Now call me an old cynic (you’re an old cynic! – Ed.) but I didn’t jump for joy at the prospect of oodles of cash and tons of sunshine (or even Guinness) ’cos I’d already guessed what was coming. The catch? It would be in the online gambling industry!

Fact is, there are some industries that are just not worth the effort of optimising for organic listings, and gambling/betting is one of them. There are a number of others – loans and bad debts, mainstream sex sites, MP3s, and mobile phones, for instance.

So what’s the problem with them?

  • Impossibly competitive markets
  • Bad site history or no history
  • Black Hat opposition

Competing for rankings with the big boys

For all these areas there is an astonishing number of competing sites. Some are long established and reputable – think big established betting shops, the big banks and loan companies, etc. – and some are decidedly not. The big guys can afford to spend massive amounts on advertising and site building, have usually had sites for many years, and those sites are usually big enough to cover just about every useful keyword available. The other guys cheat.

Domain name problems

If you’re entering one of these markets then you have two choices – start with a new domain name, or buy an established one. In a highly competitive market a new site on a new domain name will take a very long time to rank. Google’s ageing filters will apply, and the usual remedy of getting links from trusted sites will be harder to apply. However, if you buy a domain name that has been in use previously, you may run into bad history problems. These appear when the previous owner has employed unethical techniques in trying to rank. If they’ve used cloaking or hidden text, or been connected to link farms, then this may very well hamper your attempts to optimise the site because of the need to re-establish trust.

Competing with Black Hat techniques

What about the smaller opposition? A substantial number of them in any of these big-money industries will be short-term sites which are there to make a fast buck and then drop out again. They don’t care about long-term results, so they are happy to employ unethical or Black Hat techniques which may well get them highly ranked in the short term but will get them banned once the search engines work out what they’re doing. Because there is a constant stream of these, there are always some of them competing against the ethically built sites, making your job far more difficult than in a more normal business sector.

So I won’t be going to Gibraltar – a pity, considering this year’s British monsoon-season summer. Anyone fancy a Guinness?


Google Supplementals and the Dungeons of Doom

[edit – I must be psychic – the same day that I wrote this post, the Google Webmaster Central Blog announced ‘Supplemental goes mainstream’. It seems they don’t agree with my reasoning and are determined to make it harder for us. Only time will tell whether the results from the two indexes really do come together, but the comments on that announcement certainly seem pretty sceptical.]

There’s a lot of discussion at the moment about Google’s removal of a search method (though it looked more like a hack) which was supposed to allow you to check which of your pages were in the supplemental index rather than the main one.

It should be said that there was always some doubt about this method – it seems to have thrown up inconsistent results and there was a suggestion that it was possible to appear in both indexes at the same time so the results could be misleading. No wonder supplementals are a source of confusion.

At various times we’ve had Matt Cutts suggesting that we’re all too hung up on supplementals, and even that Google might remove the supplemental indicator from the site: command results. [edit – which is what they’ve now done] The latter seems like a very retrograde step: it’ll leave people even more in the dark about what Google thinks of their pages, and it will simply fuel the sort of rumours that cause so many problems for those without the experience to follow good practice or the money to employ professional help – rumours which Matt himself will then have to debunk.

So are we all hung up on supplementals? Well, there seems little doubt that a supplemental page will not rank in a competitive field. I recently saw a revamped site which was poorly connected to its detail pages. To remedy this, an intermediate page was built containing a great deal of information, linking to subpages which again had plenty of good-quality information, around which the deepest detail pages were clustered. When this intermediate page first went live it immediately started to rank for a number of terms, and the subpages started to rank too. However, a few weeks later the intermediate page dropped into the supplemental index, as did the subpages; the rankings immediately vanished, and have not returned.

Such a situation seems to lack natural justice and is likely to cause confusion amongst reputable webmasters who are trying to produce good sites – which is what Google claim to want. Since we don’t know exactly why pages go supplemental, some people will speculate and be tempted to take actions that may not benefit their users. Cutts has hinted that one of the reasons is lack of PageRank. That sounds dangerously like an invitation to point lots of links at the pages concerned, as well as an admission that quality is secondary to link strength. And then they wonder why people go out and buy links!

We can also look at the other end of the problem. Let’s allow for the moment that the algorithm has perfectly good reasons for downgrading a page – that a particular page simply isn’t good enough to be worth ranking in the main index. Shouldn’t a conscientious webmaster be given a method by which they can check which pages of their site are regarded as poor, so they can take action to improve them?

Come on Google, give us the information we need to improve our sites, and then reward those improvements accordingly. Happier webmasters, better sites, better quality search results. Isn’t that what we all want?

Does SEO = Google Optimisation?

Why should you care about Yahoo and MSN?

Increasingly I hear prospective clients say that they are only interested in Google results and nothing else. However, while it’s true that Google are way out in front on traffic referrals, and that both Yahoo and MSN/Live Search can be erratic and unreliable, I feel this is a bad approach to search engine optimisation in general.

There are two reasons for this:

  • Over-reliance on one source of traffic
  • Narrow thinking in terms of site development

In the first case you have no fallback position if it all goes wrong. What if Google, in their attempts to eradicate spam, make another major algorithm change, as has happened a few times over the last few years? Even if you’re innocent you could still be caught up in the fallout and lose key rankings. That’s always going to hurt, but if you’ve no rankings elsewhere it could be fatal to your business. Rankings on Yahoo might just keep you going until Google works out the bugs.

The second case is to my mind even more important. If you’re constantly thinking about Google and what they’re doing then you’re not thinking about your customers. Your mindset may increasingly be to look for tricks and techniques that chase rankings rather than pursue good design, good content, and good usability. If that’s where you end up then you’ll find your conversion rate dropping because your customers no longer like your site.

SEO shouldn’t be about making poor sites rank above their level; it should be about building a site that deserves to rank well because it offers a good user experience, and then making sure it achieves that deserved ranking.

If you can’t Digg it, Sphinn it

Danny Sullivan’s Search Engine Land branched out into a new area at the weekend – social networking for SEOs. Since the denizens of Digg decided to bury any vaguely search-related topic that came their way, in the misguided belief that all SEOs were spammers, there have been few places where you could follow search community news and discussion without subscribing to a wide range of blogs or RSS feeds. So Danny’s answer is to build his own version – Sphinn (and yes, no-one seems too sure how to pronounce it).

Danny is widely regarded as the top SEO guru and I’ve no doubt that most decent SEOs will be queuing up to join anything that has his name on it and helps their articles gain visibility. Bet the guys at SEOMoz wish they’d thought of it first 😉

Pity the Americans won’t understand any Alastair Campbell jokes…


Fears of search fragmentation in the USA

PPC experts in the US are worried that recent attempts by the search engines to expand their services and the range of displayed results are causing a drop in the effectiveness of Pay Per Click advertising. These are the paid-for results shown on the right-hand side of most search engines’ results pages, and in some cases at the top and bottom in pale-coloured boxes. It seems that click-through rates (CTR) have been unusually low for the last month.

The theory, and it’s a fairly convincing one, is that the maps, images or videos which are being displayed above the main organic text results take the eye left and down so that viewers hardly notice the PPC results at all.

If this results in advertisers losing faith in PPC then Google in particular may find their revenues dropping. Will this cause them to rethink Universal Search, or will they adopt another approach such as redesigning their results pages? For so long their clean interface and layout were part of their attraction, but it’s becoming gradually more cluttered.

So far we still haven’t seen much of the Universal-style results in the UK – I wonder if this is because we’re lagging behind the US, or if there are second thoughts creeping in?

Making sure the spiders can get in

One of the most common causes of a site having poor search engine rankings is the search spiders not being able to penetrate part or all of the site. If the site isn’t fully indexed then it can’t work cohesively to provide relevance via its links and the most important pages may be missed altogether.

There are various things you need to make sure are in place to enable search engine spiders to properly index the site.

1. Link to the site

Make sure they can find it – you need a link from a site which is itself regularly spidered and preferably has some degree of rankings for its own keywords. Forget about submitting the site, and particularly ignore all those offers to auto submit it to ‘5000 search engines’ or the like. Even if there were 5000 search engines it would do more harm than good.

2. Provide useful content

Usually the home page will be the first page a spider finds, so make sure it finds something worthwhile. Don’t give it a splash page with no content, or it may decide there’s nothing worth looking at. Flash looks like no content; so does an image or some other kind of animation, and so does a frameset containing another site. Show it good-quality text saying what the site is about, and provide appropriate meta information about that page – not the rest of the site.
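To make that concrete, here’s a minimal sketch (my own illustration, not part of the original post) of roughly what a text-only spider is left with: the title, the meta description, and visible body text, while Flash objects and images contribute nothing. The parsing rules and the example page are assumptions chosen purely to show the idea.

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Rough sketch of what a text-only crawler keeps from a page:
    the title, the meta description and visible text. Script, style,
    Flash <object>/<embed> blocks and images contribute nothing."""

    SKIP = {"script", "style", "object", "embed"}

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.text = []
        self._in_title = False
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag in self.SKIP:
            self._skip_depth += 1
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        data = data.strip()
        if not data or self._skip_depth:
            return
        if self._in_title:
            self.title += data
        else:
            self.text.append(data)

# A hypothetical Flash-only splash page leaves the spider with a
# title and nothing else to index:
splash = ('<html><head><title>Welcome</title></head>'
          '<body><object data="intro.swf"></object></body></html>')
view = SpiderView()
view.feed(splash)
print(view.title, view.description, view.text)  # Welcome  []
```

A page with a paragraph or two of descriptive text and a sensible meta description would, by contrast, give the spider plenty to work with.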

3. Spider-friendly navigation

Now that it’s found some content, make sure it can get to the rest of it. Good text-based navigation is a must. Text links aren’t just easy to follow, they also pass relevance. Image-based buttons can be followed if the link is formed correctly, but they pass no relevance themselves and very little via the alt text compared with a text link. That doesn’t mean the links can’t look like buttons – some of the best ‘buttons’ are CSS-styled text links.

If your site uses forms extensively then make sure there is also a way for the spiders to bypass them. They can’t fill in forms or select from a drop-down list, so those jump menus are no use to them either.
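As a rough illustration (again my own sketch, not from the original post), the snippet below mimics a spider’s link discovery: plain <a href> anchors are collected along with their anchor text, an image button yields only its alt text, and a JavaScript jump menu exposes no URL at all. The markup is a made-up example.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Sketch of a spider's link discovery: only <a href> anchors are
    collected. Image buttons contribute just their alt text, and form
    selects / JavaScript jump menus expose no crawlable URL at all."""

    def __init__(self):
        super().__init__()
        self.links = []          # (url, relevance-carrying text) pairs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self._href = attrs["href"]
            self._text = []
        elif tag == "img" and self._href is not None:
            # Image button inside a link: only the alt text is available.
            self._text.append(attrs.get("alt", ""))

    def handle_data(self, data):
        if self._href is not None and data.strip():
            self._text.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, " ".join(self._text)))
            self._href = None

# Hypothetical navigation block mixing a text link, an image button
# and a JavaScript jump menu:
nav = """
<a href="/widgets/">Blue widgets</a>
<a href="/gadgets/"><img src="btn.gif" alt="gadgets"></a>
<select onchange="location=this.value">
  <option value="/contact/">Contact us</option>
</select>
"""
spider = LinkExtractor()
spider.feed(nav)
print(spider.links)
# [('/widgets/', 'Blue widgets'), ('/gadgets/', 'gadgets')]
# The jump menu's /contact/ page is never discovered.
```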

4. Provide friendly URLs and filenames

If your site is based on one of the common e-commerce packages or is database driven, then you need to ensure that the page URLs it produces are friendly for the search engines. Session IDs and long, complex query strings with multiple parameters can cause spiders to avoid the pages they point at. If you have an existing site with these problems then you need to consult a developer or an SEO with experience of these issues.
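As a quick, hedged illustration (my own example, not a rule from this post or from any search engine), here’s a tiny checker that flags the sort of URLs described above. The session-parameter names and the two-parameter threshold are arbitrary assumptions chosen just to show the idea.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative assumptions only: common session-id parameter names and a
# rough threshold for "too many" query parameters.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}
MAX_PARAMS = 2

def crawl_warnings(url):
    """Return a list of reasons a spider might be wary of this URL."""
    params = parse_qs(urlparse(url).query)
    warnings = []
    if any(name.lower() in SESSION_PARAMS for name in params):
        warnings.append("session id in the URL")
    if len(params) > MAX_PARAMS:
        warnings.append(f"{len(params)} query parameters")
    return warnings

for url in (
    "http://example.com/products/blue-widgets/",
    "http://example.com/cart.php?PHPSESSID=a1b2c3",
    "http://example.com/view.asp?cat=7&sub=3&item=912&sort=price",
):
    print(url, "->", crawl_warnings(url) or "looks spider-friendly")
```

In practice the fix is usually to have the platform generate friendlier, parameter-free URLs, which is where an experienced developer or SEO comes in.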

Cover these basic requirements and you will already be well on the way to achieving good results – assuming of course that you have good content!
