Making sure the spiders can get in

One of the most common causes of a site having poor search engine rankings is that the search spiders can't penetrate part or all of the site. If the site isn't fully indexed then it can't work cohesively to pass relevance through its links, and the most important pages may be missed altogether.

There are various things you need to make sure are in place to enable search engine spiders to properly index the site.

1. Link to the site

Make sure they can find it – you need a link from a site which is itself regularly spidered and preferably ranks for its own keywords. Forget about submitting the site, and in particular ignore all those offers to auto-submit it to '5000 search engines' or the like. Even if there were 5000 search engines, it would do more harm than good.

2. Provide useful content

Usually the home page will be the first page a spider finds. Make sure it finds something worthwhile. Don't give it a splash page with no content, or it may decide there's nothing worth looking at. Flash looks like no content; so does an image or some other kind of animation. So does a frameset containing another site. Show it good-quality text saying what the site is about, and provide appropriate meta information about that page – not the rest of the site.
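As a rough sketch, a home page that gives the spider something worthwhile might look like this (the business name, copy and description below are illustrative placeholders, not a recommendation of specific wording):

```html
<!-- Illustrative sketch: real indexable text plus meta information about this page only -->
<head>
  <title>Handmade Oak Furniture - Smith &amp; Sons</title>
  <!-- Describes this page, not the rest of the site -->
  <meta name="description" content="Handmade oak tables and chairs, built to order in our workshop.">
</head>
<body>
  <h1>Handmade Oak Furniture</h1>
  <p>We design and build solid oak tables, chairs and sideboards to order.</p>
</body>
```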

3. Spider-friendly navigation

Now that it's found some content, make sure it can get to the rest of it. Good text-based navigation is a must. Text links aren't just easy to follow; they also pass relevance. Image-based buttons can be followed if the link is formed correctly, but they pass no relevance themselves and very little via the alt text compared to a text link. That doesn't mean the links can't look like buttons – some of the best 'buttons' are CSS-styled text links.
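A minimal sketch of a CSS-styled text link that looks like a button (the class name, colours and URL are made up for illustration):

```html
<!-- To a spider this is still an ordinary link with keyword-bearing anchor text -->
<style>
  a.button {
    display: inline-block;
    padding: 8px 16px;
    background: #336699;
    color: #ffffff;
    text-decoration: none;
    border-radius: 4px;
  }
</style>
<a class="button" href="/oak-tables/">Oak tables</a>
```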

If your site uses forms extensively, make sure there is also a way for the spiders to bypass them. They can't fill in forms or select from a drop-down list, so those jump menus are no use to them either.
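For example, a JavaScript jump menu gives a spider nothing to follow, so a parallel set of plain text links keeps those pages reachable (the URLs here are hypothetical):

```html
<!-- Spiders can't use this: navigation driven by a select element -->
<select onchange="window.location = this.value;">
  <option value="/oak-tables/">Oak tables</option>
  <option value="/oak-chairs/">Oak chairs</option>
</select>

<!-- So also provide ordinary text links, e.g. in the page footer -->
<p>
  <a href="/oak-tables/">Oak tables</a> |
  <a href="/oak-chairs/">Oak chairs</a>
</p>
```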

4. Provide friendly URLs and filenames

If your site is based on one of the common ecommerce packages, or is database driven, then you need to ensure that the page URLs it produces are friendly for the search engines. Session IDs and long, complex query strings with multiple parameters can cause the spiders to avoid the pages they lead to. If you have an existing site with these problems then you need to consult a developer or an SEO with experience of these issues.
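To illustrate the difference, here is the same product linked with a spider-unfriendly URL and with a cleaner one (both URLs and parameter names are invented; the clean form would typically be produced by URL rewriting on the server):

```html
<!-- Hard for spiders: a session id plus a long multi-parameter query string -->
<a href="/shop/product.php?id=123&amp;cat=45&amp;sessid=9f8e7d6c">Oak dining table</a>

<!-- Friendlier: a short, keyword-bearing URL -->
<a href="/shop/oak-dining-table/">Oak dining table</a>
```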

Cover these basic requirements and you will already be well on the way to achieving good results – assuming of course that you have good content!
