The practice of search engine optimization first emerged in the mid-1990s, when the first search engines began cataloguing the contents of the Web. Initially the process was fairly honest and gave a fair reflection of the content that actually existed on the web: sites were submitted to the search engines, a spider crawled their content, and the collected data was stored in a database that could be queried by individuals performing a search.
When a search engine spider detects new content on the Internet, it downloads the page and stores it on the engine's own server. Once on the server, a second program, known as an indexer, extracts information about the page as well as all of the links it contains. Those links are then placed into a repository of pages to be crawled at a later date.
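The crawl-then-index cycle described above can be sketched in a few lines of Python. This is a minimal illustration, not any engine's actual implementation; the page content and URL are hypothetical, and only the standard library's HTML parser is used:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Indexer step: collect the links a downloaded page contains."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Simulated crawl: the spider downloads a page and stores it on the
# engine's own server, then the indexer extracts its links, which are
# queued to be crawled at a later date.
page = '<html><body><a href="/about">About</a> <a href="http://example.com/x">More</a></body></html>'
stored_pages = {"http://example.com/": page}   # the engine's server
extractor = LinkExtractor()
extractor.feed(page)
crawl_queue = list(extractor.links)            # pages to crawl later
```

Real crawlers add politeness rules, deduplication, and scheduling, but the download–index–queue loop is the same shape.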
At first, the information that ended up in these early search engines came from webmasters, who were trusted to be honest. Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provided a guide to each page's content. However, the corruption of this system began when...
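The keyword meta tag that early algorithms trusted is easy to read mechanically, which is part of why it was so easy to abuse. A minimal sketch, again using only the standard library, with a hypothetical page snippet:

```python
from html.parser import HTMLParser

class MetaKeywordReader(HTMLParser):
    """Reads the keyword meta tag that early engines took at face value."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "keywords":
                self.keywords = [k.strip() for k in d.get("content", "").split(",")]

page = '<head><meta name="keywords" content="search, optimization, history"></head>'
reader = MetaKeywordReader()
reader.feed(page)
# reader.keywords now holds whatever the webmaster claimed the page was about
```

Because the tag's contents were self-reported and never checked against the visible page, nothing stopped a webmaster from listing popular but irrelevant keywords.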