Just how temporary are blog rankings?

I noticed a curious phenomenon in my Google rankings this week. While my general rankings based on the main site are doing well and mostly improving, all the terms that were ranking via this blog have dropped out of sight. Now, I haven’t been blogging as much recently due to other commitments, so I’m wondering whether the drop is simply a Google glitch that will recover as they generally do, some more fundamental change in the algorithm that has hit my blog results, or simply a consequence of my content not being updated as quickly as before.

If it’s the latter then I wonder if we’ve all made a rod for our own backs, in that we need to keep on writing new articles more and more or face the consequences. As I blogged about in my Coals to Newcastle post, I’m reluctant to churn out the same stuff that everyone else is doing, because that feels wrong and pointless. On the other hand I’m a bit miffed to find that the top rankings I had for terms like “Scottish SEO consultant”, which I blogged about in Despairing of Google, have collapsed down to about 280th place.

Anyone else seeing a blog dropping out of the rankings?

A question of balance

Which SEO techniques are ‘dead’ this week?

There seems to have been a plethora of articles in recent weeks about all the SEO techniques that allegedly don’t work any more: reciprocal links (that one seems to have been dying for years), free directory submissions, paid directory submissions, link bait, paid links, or whatever (funny how these are all about links – as Michael Martinez would say, if all you think about is links, you’re not doing SEO!). Usually these ‘stories’ are based on some Google tweak, real or imagined, news of which then spawns endless ill-informed speculation and forum posts, and often a load of whining; the latter usually from people who are trying to game the system and aren’t happy that their favourite method (or maybe their only method) has been downgraded.

The truth is rather different. Few of these techniques are dead, as long as they are used in a balanced way. Of course, if you rely on a single technique then you deserve all the grief you get when it goes down the tubes. The sensible and ‘ethical’ way has always been to maintain a natural balance. It’s natural to have some entries in directories – that’s what they were invented for. It’s not natural to have thousands of them and no other links. Nor is it natural to have links in directories that are based on totally different subjects – why is your property site in the middle of sex and gambling links?

It’s natural that some sites will be sufficiently similar that you each link to the other, and it’s natural that there will be cases where you are sufficiently grateful that someone has linked to you that you want to link back. It’s not natural if all your links are reciprocal.

At the end of the day the search engines (in theory) are looking for quality sites. Such sites usually have a natural balance of all these factors. An imbalance of any of them is at best a warning to the algorithms that a site may not be entirely kosher and at worst may show a concerted attempt to gain unfair advantage.
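
Just to make the idea of ‘balance’ concrete, here’s a toy sketch – entirely my own illustration, with invented link-type labels and an invented threshold, and not anything a search engine actually publishes – of how you might eyeball the mix of inbound link types for a site:

```python
from collections import Counter

# Hypothetical link profile: each inbound link tagged by how it was obtained.
# The tags and the 60% threshold are illustrative only.
inbound_links = ["editorial", "directory", "reciprocal", "editorial",
                 "directory", "editorial", "reciprocal", "editorial"]

counts = Counter(inbound_links)
total = len(inbound_links)

for link_type, count in counts.most_common():
    share = count / total
    flag = "  <-- dominates the profile" if share > 0.6 else ""
    print(f"{link_type:10s} {share:.0%}{flag}")
```

The point isn’t the numbers; it’s that a profile dominated by one artificial source is exactly the kind of imbalance the algorithms can treat as a warning sign.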


Out of the Hellmouth – maybe

Google announce the end of the supplemental index

In a curiously timed announcement, just when many people are turning to thoughts of Xmas rather than SEO, Google have said that from now on all search queries will access the full index for relevant results, effectively ending the problem of good quality pages being ignored because they’ve been relegated to the supplemental index.

Supplementals have been a serious problem for the last few years for many sites which have either a deep navigation structure or insufficient inbound links – both situations that result in pages with a low PageRank, which is known to have been a trigger for dropping out of the main index and into the supplemental one. Because Google was only consulting the supplemental index when there were insufficient relevant pages in the main index for a query, any pages with rich content but poor PageRank were effectively being excluded from achieving good search results.

Some commentators have suggested that the supplemental index goes hand in hand with PageRank to enable a fast enough response time to queries – Dan Theis has a blog posting on the subject which makes interesting reading, and it’s intriguing to wonder, if his theory was right, what Google have been doing under the hood and how they’ve overcome the performance problems it suggests.

So far I haven’t seen any evidence of change in search results, but that may be because I’ve been used to designing with a view to avoiding the problem. However I’ll be revisiting a number of sites which I know have suffered in the past to see if matters improve for them.

It has to be said that a number of SEO commentators are sceptical about whether there will be any real benefit or whether it’s just window-dressing. Andy Beard is one of those who also isn’t seeing any change in the SERPs. To be fair, though, there may be a delay in getting any changes rolled out through the vast number of data centres that Google uses. One thing I’ll be watching closely is the listing of internal links in Webmaster Tools, because I’ve long suspected that this only lists pages that come from the main index. If that is true then we should see all the pages in a site appearing in that listing. I’ll keep watching this and return to it in a few weeks’ time.

Russians improving the search interface

Earlier this week over at AltSearchEngines the title of alternative search engine of the year was awarded to Quintura. Strictly speaking it doesn’t seem to be a search engine in its own right but rather a search interface, and that’s where its value lies.

Quintura is of Russian origin, and it makes use of that design feature beloved of Web 2.0 bloggers – the cloud. But instead of the static and, to my mind, fairly pointless version we’re used to seeing, it uses the cloud idea in a dynamic way, interfacing it with the search box via some fairly heavyweight artificial intelligence algorithms. It currently uses a Yahoo XML feed for its main web and image search, plus Blinx for video search, and there is also an Amazon search facility, but from the description of the associated desktop search it looks as if it can work with others too. They have a Russian-language version which works with Yandex results.

Searching by Clouds

Basically you enter an initial search as normal and the usual results list appears in the right half of the screen. On the left (or optionally above, if you reposition it from the settings menu) the cloud appears, with your initial search terms in bold red in the centre. Clustered around it are other related terms. Hover over one of them and that word is temporarily added to your base search phrase (you’ll see this happen in the search input box), and the search results update accordingly, as does the cloud. Hover over another word and it replaces the first addition, while if you click on a word it is added to the search query as a persistent addition, so you can then refine further without losing it. Hovering over a word also causes a red x to appear beside it, and clicking the x excludes that word in the same way as using a minus sign in the search box would. Double-click in a blank area of the cloud and a text box appears so you can add a word that doesn’t appear in the cloud.

All this takes longer to describe than to do – once you get used to using the cloud you find it’s a fast way of refining results without taking your hand off the mouse.
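
For the curious, the refinement behaviour amounts to keeping track of added and excluded terms and rebuilding the query each time the cloud changes. Here’s a rough Python sketch of that idea – my own approximation, not Quintura’s actual code, and the class name and example terms are invented:

```python
class CloudQuery:
    """Toy model of cloud-style query refinement (illustrative only)."""

    def __init__(self, base: str):
        self.base = base          # the phrase typed into the search box
        self.added = []           # terms clicked in the cloud (persistent)
        self.excluded = []        # terms removed via the red x
        self.hovered = None       # term currently hovered (temporary)

    def hover(self, term: str):
        self.hovered = term       # replaces any previously hovered term

    def click(self, term: str):
        self.added.append(term)   # persists the term so you can refine further
        self.hovered = None

    def exclude(self, term: str):
        self.excluded.append(term)  # same effect as a minus sign in the box

    def query(self) -> str:
        parts = [self.base] + self.added
        if self.hovered:
            parts.append(self.hovered)
        parts += [f"-{t}" for t in self.excluded]
        return " ".join(parts)

q = CloudQuery("scottish castles")
q.hover("edinburgh")   # temporarily: "scottish castles edinburgh"
q.click("history")     # persists:    "scottish castles history"
q.exclude("hotels")    # excludes:    "scottish castles history -hotels"
print(q.query())
```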

Because the cloud interface is so intuitive for children there is a separate children’s version, and the founders of the company are apparently planning a version specifically aimed at mothers.

Although the cloud interface takes up a fair bit of screen space, given that most search results leave a lot of white space anyway it is less intrusive than you might expect, and all in all this seems like a useful progression in search interface design and it’s one I’ll be investigating further.

I certainly find it more useful than the various attempts at ‘universal search’ that we’ve seen so far – most of those seem to get in the way of seeing normal results by cluttering the space up with other media or somewhat doubtful local results. Quintura’s beauty is that, like all the best ideas, it is essentially simple in operation for the user. At the moment it’s still sporting a Beta tag so I look forward to seeing whatever future developments they can come up with.

Despairing of Google

Sometimes I really do want to beat my head off the keyboard! Like many others who may be reading this, I espouse ethical search engine optimisation techniques. I advise clients not to use any of the spammers’ tricks and to produce quality content. I tell them not to use hidden text or dodgy redirects or spammy meta-tags, and that Google and the other engines will pick that stuff up and kick them out of the indexes.

And then what do Google do? I’ll demonstrate.

For quite some time I’ve ranked number one in Google for “Scottish SEO consultant”. I’m not precious about it; there are some excellent SEO consultants in Scotland and I’m happy to be in amongst them – the phrase happens to be in my blog title, so that gives me an advantage on that one out of scores of other similar phrases. Tonight I checked some of my rankings, as Google has been erratic again recently (a coincidence with the big PageRank debacle?), and saw that I was down to number six for that phrase. No problem – I noticed that Shaun Anderson’s Hobo site was above me, and that’s OK, it’s a damn good site. I was a bit peeved to notice that above him was a company in Derbyshire (with a greyed-out PR bar!) whose only connection to Scotland seems to be that they did a site for a Scottish property company.

But what really got my back up was the site at number one –
Work at Home – company scotland seo directory
www.easywebcreator.com/workathome0/company%20scotland%20seo

This is basically a directory search result on a site that advertises franchise businesses, crammed full of AdSense ads and padded out with Google News items. But that’s not what really bugged me. If you visit that page, view the source code and look at the meta-keywords tag. For those of a non-technical disposition, or who would rather not visit it, I’ll describe it. I copied it into a Word document to check the number of characters. It filled 9 pages! It has 5,622 words and 41,653 characters!! These include such relevant terms as “abraham lincoln”, “angelina jolie”, “Black Sabbath”, and yes, you guessed it, “Britney Spears”.
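
(Incidentally, you don’t need Word to check this sort of thing – a few lines of Python will pull out a page’s keywords tag and count it. This is only a rough sketch: the regex assumes the name attribute comes before content, which isn’t guaranteed on every page.)

```python
import re
import urllib.request

def meta_keywords_stats(url: str):
    """Return (word_count, char_count) of a page's meta-keywords tag, or None."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Naive match: assumes name="keywords" appears before the content attribute.
    match = re.search(
        r'<meta[^>]*name=["\']keywords["\'][^>]*content=["\'](.*?)["\']',
        html,
        re.IGNORECASE | re.DOTALL,
    )
    if not match:
        return None
    keywords = match.group(1)
    return len(keywords.split()), len(keywords)

# e.g. meta_keywords_stats("http://www.example.com/")  ->  (word_count, char_count)
```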

This is what the world’s most popular search engine thinks is the most relevant site for “Scottish SEO consultant”. A company full of the finest brains in the world, who spend enormous amounts of time devising methods of catching spam, and they can’t spot a page whose keywords tag is full of total, unmitigated spam.

Matt Cutts, are you listening?!!! Because as long as this sort of stuff is not only allowed into the indexes but actually gets to the top of search results, your appeals for webmasters to do things like add “nofollow” to links are totally laughable. It undermines any chance of us ethical SEOs nudging our clients in the right direction towards quality sites.

I’m off to find another keyboard, this one’s got a dent in it.

What to do if the Web 2.0 bubble bursts

OK, confession time first: I find the whole Web 2.0 thing a bit of a joke. The cute Ajax gimmicks, the jumbo-sized ‘kids’ play model’ look that makes you wonder if your screen’s reverted to 800×600, the DTP-circa-1998 graphics. Some of the interactivity is useful, but a lot of it reeks of bandwagon-jumping.

Anyway, this isn’t really about that Web 2.0; it’s about the financial and marketing fraternity’s idea of Web 2.0: the massive advertising industry that’s grown up around social networking interaction, and the ever-expanding ecommerce sector which reportedly threatens to close the high street shops for good.

With the current banking crisis knocking economic and financial confidence, we’ve started to see reports speculating about some big net companies – AOL seem to be a favourite target – going into decline and triggering another dot.com crash that would see IT and marketing budgets plummet.

If such a crash were to happen, what is your best response if you have an online business? Naturally it depends on whether your business is one that might suddenly vanish, but assuming you still have potential customers, the only sensible thing to do is not to cut your marketing spend but to ensure that it will get you through the long haul. Good SEO, in the true sense of the phrase, is vital to your survival. A site that is fundamentally strong and well structured will outlast the get-rich-quick merchants, because it will give users a positive experience and attract good rankings because of it. If buyers are scarce it matters even more to convert as many visitors as possible, because it’s a lot easier to increase conversion percentages than to attract a hundred times as many visitors – lifting a 1% conversion rate to 2% doubles your sales from exactly the same traffic.

Get yourself a good, experienced SEO who will look at all aspects of your website and advise you on the total experience that you’re presenting to users as well as search engines – not one of the wildly-promising one-note players who will tell you that ranking is all about a single parameter, such as links. If there is a downturn then there are likely to be casualties, and therefore more space for the quality sites to rise to the top. Investing in quality advice now will help you to survive and prosper in difficult times, if they do appear.
