Paid link controversy and thoughts about ranking algorithms

There’s been much reporting of a session at the SES conference in the USA which discussed Google’s stance that paid links should always carry a nofollow tag or risk being penalised. On the expert panel it seems to have been largely a case of Matt Cutts versus the rest, and the audience appears to have sided firmly with the panel’s anti-Google majority.

This whole situation not only raises questions about how Google can be impartial when it stands to profit from the effects via AdWords, but also about the whole basis of the main Google algorithm.

A great deal has changed since the original algorithm appeared, based on the conceptually simple but mathematically complex idea of using links as a measure of value. There are now, of course, many factors at work in the algorithm, but two very important ones are the true PageRank of the sites which link to you, and the anchor text used in those links. In very simplistic terms, the former indicates the strength of a link while the latter indicates its relevance to a particular set of search terms. One well-known SEO commentator, Michael Martinez of SEO Theory, thinks that Google should stop passing the anchor text, and that this would largely solve the paid text link problem. Now Martinez can be rather outspoken and seldom suffers fools (or amateurs pretending to be professionals), but he’s usually worth listening to even if you don’t always end up agreeing with him. So what would the effect be of dropping the link text from the algorithm? Indeed, how easy is it to predict the effects of any change to it?
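To make the two signals concrete, here’s a toy sketch of how link strength and anchor-text relevance might combine. Everything here – the three-page graph, the damping factor, the `score` rule and all the names – is an illustrative assumption, not Google’s actual implementation:

```python
# Toy model of the two signals discussed above: PageRank-style link
# value and anchor-text relevance. Purely illustrative, not Google's code.

DAMPING = 0.85  # damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small "teleport" share...
        new = {p: (1 - DAMPING) / n for p in pages}
        # ...and passes the rest of its rank along its outbound links.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

# A tiny link graph: A and B both link to C; C links back to A.
links = {"A": ["C"], "B": ["C"], "C": ["A"]}
rank = pagerank(links)

# Anchor text attached to inbound links, keyed by the page linked to.
anchors = {"C": ["blue widgets", "widgets"], "A": ["home"]}

def score(page, query):
    """Combine link strength (PageRank) with anchor-text relevance."""
    relevance = sum(query in text for text in anchors.get(page, []))
    return rank[page] * (1 + relevance)
```

Drop the `(1 + relevance)` term from `score` and you have, in effect, the Martinez proposal in miniature: rankings driven by raw link strength alone, with relevance left to on-page factors.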

The algorithm now appears to be so complex that it’s almost like observing a natural-world system – and some pretty unpredictable things can happen in those. For instance, if you decide to play god and wipe out an irritating insect, then the creatures that feed on it will be affected. Some may themselves be drastically reduced in number, while others may be able to switch to another food source, which in turn affects another species. Movements may occur in populations which then allow other movements and changes in other predators and prey. Similar effects can sometimes be seen in search engines – filters designed to get rid of spammy sites can end up affecting perfectly good sites – remember Big Daddy?

So let’s say we remove link text – that removes a fair degree of relevance, leaving mainly the subject matter of the two linked pages to determine it. If that happens, it will no longer matter as much that links come from related sites, so existing poor links may increase in value and webmasters will be more tempted to follow the “get loads of links from anywhere” route. Another possibility is that PageRank would become relatively more important again, so we might see a return to the tedious reciprocal link requests that insist on a minimum PageRank for the link back, as well as those PR calculators for working out how to concentrate PR on pages by manipulating the navigation (usually making the site unusable in the process).

This whole area is one that I feel needs discussion and thought by people from different perspectives to have any chance of coming to a sensible conclusion. Anyone else got any thoughts on the likely effects? Do you agree with Martinez or do you think his solution is too simplistic to work? Would Google listen to us anyway or are we at their mercy with their experts playing god with the search results? What sort of search engines do we want to see in the future?
