Crawlability & Indexing


When you enter a search query, search engines send out programs they call “bots” to scan the web for websites that return relevant information based on the keywords or phrase searched for. If your website is not easy for these bots to find and read, you will not appear in the search listings.

It is essential that these search engine “bots” can quickly and easily find and crawl your website.

If they are unable to find your website, you could be missing out on a wide market of potential clients.

Use of an XML Sitemap

A sitemap is an important requirement, as it lists your site’s pages in a format Google’s bots read to discover and index them.
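A minimal sitemap is a plain XML file, usually served at /sitemap.xml. The sketch below follows the standard sitemap format; the domain and dates are placeholders for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Once the file is in place, you can submit its address to Google through the free webmaster tools mentioned below so the bots know where to look.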

Website Redirects

Out-of-date website URLs will only direct potential clients to an error page and put them off, so old addresses should redirect to the pages that replaced them.
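On Apache hosting, a common way to do this is a permanent (301) redirect in an .htaccess file. This is a sketch only; the file paths are hypothetical placeholders:

```apache
# .htaccess — permanently redirect an out-of-date URL to its replacement
Redirect 301 /old-services.html /services/

# Or, with mod_rewrite, redirect a whole retired section to its new home
RewriteEngine On
RewriteRule ^old-blog/(.*)$ /blog/$1 [R=301,L]
```

A 301 tells Google the move is permanent, so the old URL’s search listing is passed on to the new one rather than leading to an error page.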

Google Webmaster Tools

Google offer many free website tools designed to help your website get found.

Broken Links

Links from text or images that do not work may be looked at by Google as a sign of poor quality, or even unwanted spam.
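The first step in auditing broken links is simply collecting every link on a page. A minimal sketch using only Python’s standard library is below; the page fragment is hypothetical, and a full audit would then request each collected URL and flag any that return an error:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag and the src of every <img> tag."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.links.append(attrs["src"])

# Hypothetical page fragment for illustration
page = '<a href="/services/">Services</a> <img src="/logo.png"> <a href="/old.html">Old</a>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['/services/', '/logo.png', '/old.html']
```

Running a check like this regularly means dead links can be fixed or redirected before Google’s bots find them.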

Some of the more common factors that can prevent your site from being crawled include not using a recognised sitemap, a robots.txt file blocking important content, misconfigured redirects, and broken links or incorrect canonical links built into your website.
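A robots.txt file that blocks too much is one of the easiest of these to check: it sits at the root of your site and tells bots what they may crawl. A minimal sketch is below; the blocked path and sitemap address are placeholders for illustration:

```
# robots.txt — served at the site root, e.g. https://www.example.com/robots.txt
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Here all bots may crawl everything except the /admin/ area, and the Sitemap line points them straight at your sitemap. A stray "Disallow: /" in this file would block the entire site.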

Also, as Google do not like websites with duplicate content, another issue could be your domain name working on both the www and non-www versions.

If your website opens when using either of these URLs, the chances are that Google will recognise them as individual websites, and so as duplicated content. This can often be rectified through your hosting company’s settings, by adding specific coding to an .htaccess file, or via the Google Webmaster control panel.
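On Apache hosting, the .htaccess fix is a rewrite rule that sends all traffic to one version of the domain. The sketch below assumes you have chosen the www version as the one to keep, and uses example.com as a placeholder:

```apache
# .htaccess — send all non-www traffic to the www version with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

With this in place, Google only ever sees one version of each page, so there is no duplicate content to be penalised.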

There are a number of free Google tools that will improve your crawlability, and correct use of a sitemap will improve your indexing.