Google uses a program called Googlebot (otherwise known as a bot or spider) to 'crawl' across your webpage and millions of others every day. If the idea of a robot 'crawling' across your webpage gives you the shivers, don't worry - they're helpful little blighters! So what do these bots actually do? They visit your site regularly - sometimes as often as every few seconds - and gather information about what makes your page relevant to the search habits of Google users. Using this information, they index your webpage and all the words in it that could be relevant to searches.

Being a robot (albeit a very clever one), the Googlebot sometimes gets confused and can get the wrong idea about your page, which is not good for your rank on Google. So how do we make the bot's life easier? The old maxim 'keep it simple' is very relevant here. The bots not only collect the text on your website; they also decide how important each piece of text is to your website as a whole. A simple page structure helps the bots see your root (or home) page as the most important one, as it is the page your users will arrive at first and it usually contains a summary of what your website does. If you use logical subdivisions for your website's topics, the bot will recognise these too.

It is worth noting that the Googlebot cannot index certain rich media files or dynamic pages, so keep the important information on your site text-based. Googlebots CAN index text from Flash files, although they cannot yet do this for Hebrew and Arabic, and they cannot read Silverlight media files at all. Note, though, that other search engines are text-only - so if your text sits inside a video or animation, those search engines will not be able to read it at all! Good practice is to duplicate any information contained in videos as text on the same page; this also helps users on slower connections.

The header you use when writing an HTML page is your 'head tag', and inside it sits your page's 'title tag' - the title that appears in search results.
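As a rough sketch of what a bot reads from that head section, the example below uses Python's standard-library `html.parser` to pull out a page's title and meta description. The page content, site name and class name here are invented purely for illustration - they are not from any real website:

```python
from html.parser import HTMLParser

# A hypothetical page header: a descriptive title plus a meta description
# that a search engine could use as the snippet in its results.
PAGE = """
<html>
  <head>
    <title>Handmade Oak Furniture - The Oak Workshop</title>
    <meta name="description"
          content="The Oak Workshop builds handmade oak tables and chairs
                   to order. Free delivery across the UK.">
  </head>
  <body><p>Welcome to The Oak Workshop.</p></body>
</html>
"""

class HeadReader(HTMLParser):
    """Collects the title text and meta description, roughly as a bot would."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name") == "description":
                # Collapse the whitespace left by the multi-line attribute.
                self.description = " ".join(attrs.get("content", "").split())

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

reader = HeadReader()
reader.feed(PAGE)
print(reader.title)        # the page title shown in search results
print(reader.description)  # a candidate snippet
```

Notice that both tags describe the page in plain, natural language - exactly what the bots (and your human visitors) are looking for.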
Bots can read these tags, and they will count the title as the topic of that page, along with all the other text. This is why it is important to create original, relevant page titles: 'page 1' would not be very good, as it gives no information about what is actually on the page.

The 'meta tag' is equally important; in it you are allowed to write a couple of sentences on what the page is about. If it is relevant to the search terms used by Google users, the bots will likely use it as a 'snippet' - the short description you see on Google underneath the page title and above the URL. The bots judge whether the tag is relevant by comparing it to the content of the page, and they will ignore tags that are duplicated, keyword-stuffed or do not read naturally to a human.

So the bots aren't so different from the human users who visit your website: they like your information to be clear, relevant and easy to find. Trying to trick the bots is inadvisable - Google is wise to these tricks and has implemented sanctions for webmasters who use underhand tactics. Make your pages easy for humans to understand, and the bots will crawl around happily. Then, when someone makes a search that is relevant to you, your site will appear in a way that is easier for Google users to understand - and hopefully more tempting too.

Resource box

Search engine optimisation can be a tricky business. To keep up with the latest search engine optimisation news, make The Search Engine Optimisation Company your first port of call.