Things have changed since the early days of search engine optimization, when all a webmaster had to do was submit the site along with meta tags containing keywords and well-worded, accurate descriptions of the content of its pages. However, years of abuse by greedy marketers left the system in a mess, as misleading information, popular keywords, and other black hat SEO techniques were used to trick the search engines into ranking pages higher than they deserved. One of the biggest problems was web content providers who stuffed a number of attributes within a page's HTML code with popularly searched keywords that had nothing to do with their site.
Since those days the search engines have caught on and now look at a variety of different factors, including the text within the title tag, the domain name, the URL directories and file names of the site, the HTML tags, term frequency, keyword proximity, keyword adjacency, keyword sequence, photo captions, text within NOFRAMES tags, and the page content itself. All of this information is fed into an algorithm that determines how highly your page will rank in the search results.
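To make two of these signals concrete, here is a minimal sketch of how term frequency and keyword proximity might be measured for a page's text. The function names, the word-splitting rule, and the sample sentence are all hypothetical illustrations, not any real search engine's scoring code:

```python
import re

def term_frequency(text, keyword):
    """Fraction of words in the text that match the keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def keyword_proximity(text, kw1, kw2):
    """Smallest distance in words between two keywords; None if either is absent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    pos1 = [i for i, w in enumerate(words) if w == kw1.lower()]
    pos2 = [i for i, w in enumerate(words) if w == kw2.lower()]
    if not pos1 or not pos2:
        return None
    return min(abs(i - j) for i in pos1 for j in pos2)

page = "Fresh coffee beans: our coffee roastery ships fresh beans daily."
print(term_frequency(page, "coffee"))           # 2 occurrences in 10 words -> 0.2
print(keyword_proximity(page, "fresh", "beans"))  # "fresh beans" appear adjacent -> 1
```

A real engine combines dozens of such signals with weights, which is exactly why stuffing one keyword into a page no longer works: an unnaturally high term frequency is easy to detect with the same kind of counting.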