Content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.
The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains and where they are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.
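The spider–indexer–scheduler loop described above can be sketched in a few lines (a minimal illustration with hypothetical data structures and function names, not any real engine's code):

```python
# Sketch of the crawl-and-index loop: a "spider" fetches a page, an
# "indexer" records which words appear at which positions, and newly
# discovered links go into a scheduler for later crawling.
from collections import deque, defaultdict

def crawl_and_index(fetch, seed_urls):
    """fetch(url) -> (text, links); returns an inverted index word -> {url: positions}."""
    scheduler = deque(seed_urls)   # pages waiting to be crawled
    seen = set(seed_urls)
    index = defaultdict(dict)
    while scheduler:
        url = scheduler.popleft()
        text, links = fetch(url)   # the spider downloads the page
        for position, word in enumerate(text.lower().split()):
            index[word].setdefault(url, []).append(position)  # the indexer
        for link in links:         # extracted links feed the scheduler
            if link not in seen:
                seen.add(link)
                scheduler.append(link)
    return index

# Tiny hypothetical two-page web:
pages = {
    "a.html": ("search engines crawl the web", ["b.html"]),
    "b.html": ("engines index pages", []),
}
index = crawl_and_index(lambda url: pages[url], ["a.html"])
# index["engines"] → {"a.html": [1], "b.html": [0]}
```

Storing word positions, not just occurrences, is what lets an indexer weight words by location (e.g. in a title versus deep in the body).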
Site owners started to recognize the value of having their sites 
highly ranked and visible in search engine results, creating an 
opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.
The first documented use of the term "search engine optimization" was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB.
 Meta tags provide a guide to each page's content. Using meta data to 
index pages was found to be less than reliable, however, because the 
webmaster's choice of keywords in the meta tag could potentially be an 
inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.
 Web content providers also manipulated a number of attributes within 
the HTML source of a page in an attempt to rank well in search engines.
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better
results to their users, search engines had to adapt to ensure their result pages showed the most relevant search results, rather than unrelated pages 
stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant results would drive users to other search sources.
Search engines responded by developing more complex ranking algorithms, 
taking into account additional factors that were more difficult for 
webmasters to manipulate. Graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.
 PageRank estimates the likelihood that a given page will be reached by a
 web user who randomly surfs the web, and follows links from one page to
 another. In effect, this means that some links are stronger than 
others, as a higher PageRank page is more likely to be reached by the 
random surfer.
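The random-surfer model above can be sketched as follows (a minimal illustration of the published PageRank idea, not Google's actual implementation; the three-page graph and the 0.85 damping factor are illustrative assumptions):

```python
# Minimal PageRank sketch: a page's rank is a function of the quantity
# and strength of its inbound links, modeling a surfer who follows a
# random link with probability `damping` and jumps anywhere otherwise.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start uniformly
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: the surfer jumps to a random page.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outlink receives an equal share of this page's rank,
                # so a link from a high-rank page is "stronger".
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: two pages link to B, so B accumulates
# the most rank -- it is the page the random surfer reaches most often.
graph = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
ranks = pagerank(graph)
```

Note how the iteration conserves total rank (the scores always sum to 1), so a page can only gain rank at the expense of others, which is why inbound links from prominent pages matter more than sheer link counts.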
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.
 Off-page factors (such as PageRank and hyperlink analysis) were 
considered as well as on-page factors (such as keyword frequency,
 headings, links and site structure) to enable Google to avoid the kind 
of manipulation seen in search engines that only considered on-page 
factors for their rankings. Although PageRank was more difficult to 
game, webmasters had already developed link building tools and schemes 
to influence search engines, and these methods proved similarly applicable to gaming
PageRank. Many sites focused on exchanging, buying, and selling links, 
often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.
 
By 2004, search engines had incorporated a wide range of undisclosed 
factors in their ranking algorithms to reduce the impact of link 
manipulation. In June 2007, The New York Times' Saul