Fundamental Basics For SEO Prosperity

March 29, 2011 by Carol Forrest · Filed under: SEO

Search engine optimisation (SEO) expertise is fast becoming a key professional skill on the Internet, yet many of the basics can still be handled by the average website owner with relatively little SEO knowledge. Search engine algorithms extract fundamental information from websites to establish their relevance to particular keywords or phrases; each website is then scored and given a rank within the search engine index. Although many SEO professionals dress this up in technical vocabulary to discourage you from attempting it yourself, the way to achieve a rank with the search engines is relatively straightforward.

Perhaps the most crucial aspect of all is the content of your web pages, not just for search engine optimisation purposes but for converting visitors into clients. Good subject matter is therefore vital, and it helps enormously if it is presented, and kept up to date, in an informative way. The next thing to bear in mind is relevance: however informative your page content may be, it has little SEO worth unless it is relevant to the keywords you are aiming to rank for. Rather than simply dropping your chosen keywords into the body of your content, aim to make the content of your pages explain those keywords. The programming of the search engines is constantly evolving, so relevant content, good grammar and correct spelling all go a long way towards helping the search engine spiders determine the worth of your site.

When composing your content, do not overdo the keywords you insert, as overuse will be regarded as spam. The search engines can sniff out websites that keyword-spam, and such sites are often penalised; in some cases they may even be removed from the index altogether. This is to be avoided, as it will taint your ranking prospects for a long period. Try to keep your keyword density at no more than 2%, i.e. no more than two occurrences of a keyword per 100 words of text. By taking this line, you demonstrate that your pages value those keywords while leaving room for the rest of your content to read naturally.
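If you want a rough check on that 2% figure, keyword density is simple to compute: count how often the keyword appears and divide by the total word count. The sketch below is a minimal Python illustration, with the sample page text and keyword invented purely for demonstration:

```python
import re

def keyword_density(text, keyword):
    """Return the percentage of words in `text` that match `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

# Hypothetical page copy used only to demonstrate the calculation.
page = ("Our bakery sells fresh bread daily. Bread is baked each "
        "morning, and our bread uses local flour.")

print(round(keyword_density(page, "bread"), 1))  # → 17.6
```

At 17.6%, this sample copy would be well over the 2% guideline and would read as keyword spam; the same function makes it easy to trim the copy until the figure falls into a sensible range.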

To analyse your web pages, the search engines send out robots, sometimes referred to as spiders, to trawl the Web and examine each site's source code and meta data. Meta data is a way of notifying browsers and search engines of particular information about your web pages: the keywords, page titles and so on. Make as much use of the meta data as possible by writing keyword-rich page titles and descriptions. If you have a static site, altering this content is relatively straightforward: simply edit the HTML code of each page.
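One easy way to see your pages the way a spider does is to pull the title and meta tags out of the HTML yourself. The following is a minimal sketch using Python's standard-library `html.parser`; the sample page, its title and its meta values are invented for illustration:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collect the <title> text and any <meta name="..."> content values."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A hypothetical page head, for demonstration only.
html_page = """<html><head>
<title>Affordable Website Design for Small Businesses</title>
<meta name="description" content="Professional, affordable website design.">
<meta name="keywords" content="website design, SEO">
</head><body></body></html>"""

parser = MetaExtractor()
parser.feed(html_page)
print(parser.title)
print(parser.meta["description"])
```

Running this against your own pages quickly shows whether each one actually carries the keyword-rich title and description you intended, or whether some pages are going out to the spiders empty-handed.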

For best results, keep your site map regularly updated, as it is this that provides the search engines with a complete map of all the links you want included in their indexes. Whenever you change the site, update the site map.
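A site map is just an XML file in the sitemaps.org format listing each URL you want indexed. As a rough sketch of what "updating the site map" amounts to, the snippet below builds one from a list of page addresses using Python's standard library; the example.com URLs are placeholders:

```python
from datetime import date
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) for `urls`."""
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page_url in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = page_url
        # Record when the page was last modified (today, for this sketch).
        SubElement(url, "lastmod").text = date.today().isoformat()
    return tostring(urlset, encoding="unicode")

# Hypothetical pages, for demonstration only.
pages = ["https://www.example.com/", "https://www.example.com/about"]
print(build_sitemap(pages))
```

Regenerating this file whenever pages are added or removed, and saving it as sitemap.xml at the root of the site, keeps the search engines' picture of your links current.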

Carol Forrest is the Managing Director of Webs Galore Limited, a professional website design agency specialising in affordable website design for small and medium-sized companies. Website:



