Search Engine Optimization (SEO) is the process of improving the visibility of a website or web page in a search engine's unpaid search results. A website that appears more often, and more prominently, in the results list will receive more visitors from the search engine's users.
SEO may target different kinds of search, including image search, video search, local search, news search and industry-specific search. To optimize a website, a search engine optimizer edits the site's content, including its HTML and related code, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. To raise a site's ranking, optimizers may also work to increase the number of back-links, or inbound links, pointing to it.
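As an illustration of the on-page edits described above (the store, keyword and text here are invented), an optimizer targeting a phrase such as "handmade leather wallets" might place it in the page's title, meta description and main heading:

```html
<!-- Hypothetical example: HTML edited so the target keyword appears
     in the title, the meta description and the main heading. -->
<head>
  <title>Handmade Leather Wallets | Example Store</title>
  <meta name="description"
        content="Durable handmade leather wallets, crafted to order.">
</head>
<body>
  <h1>Handmade Leather Wallets</h1>
  ...
</body>
```

The elements shown here are simply the places on a page that crawlers read as strong signals of what the page is about; stuffing them with repeated keywords, as the later sections note, is the abuse search engines learned to penalize.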
Webmasters began optimizing their sites for search engines in the mid-1990s. Initially, a webmaster simply submitted a page's address, or URL, to the various engines; a spider would then crawl that page, extract links from it to other pages, and return the information found on the page to be indexed.
The search engine spider downloads a page and stores it on the search engine's own server. A second program, known as an indexer, then extracts information about the page: the words it contains, where those words are located on the page, any weight given to specific words, and all the links the page contains. The extracted links are placed into a scheduler to be crawled at a later date.
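The download, index and schedule steps above can be sketched in a few lines of Python. This is a minimal illustration using only the standard library, with the page content supplied inline; a real spider would fetch it over HTTP and persist the index.

```python
# Minimal sketch of the spider -> indexer -> scheduler cycle:
# parse an HTML page, count its words ("weight"), and queue its
# outbound links for later crawling.
from collections import Counter
from html.parser import HTMLParser


class PageIndexer(HTMLParser):
    """Extracts outbound links and visible words from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []          # links to hand to the crawl scheduler
        self.words = Counter()   # word -> occurrence count
        self._skip = False       # True while inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
        elif tag in ("script", "style"):
            self._skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip:
            self.words.update(w.lower() for w in data.split() if w.isalpha())


# Inline stand-in for a downloaded page.
page = """<html><body>
<h1>Search Engine Optimization</h1>
<p>Optimization improves visibility in search results.</p>
<a href="/basics">SEO basics</a> <a href="/links">About links</a>
</body></html>"""

indexer = PageIndexer()
indexer.feed(page)

schedule = list(indexer.links)   # queued for a later crawl
print(indexer.words.most_common(3))
print(schedule)
```

Real indexers record far more than word counts (positions, markup context, anchor text), but the division of labor is the same: the spider fetches, the indexer extracts, and the scheduler decides what gets crawled next.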
Site owners soon recognized the value of having their sites highly ranked and visible in search engine results. Early search engines suffered from abuse and ranking manipulation because they relied too heavily on factors, such as keyword density, that were entirely within a webmaster's control. To provide better results to their users, search engines had to ensure that their results pages showed the most relevant pages rather than the most aggressively optimized ones.
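Keyword density, the factor mentioned above, is simply the share of a page's words taken up by a given keyword, which is why it was so easy to manipulate. A rough sketch:

```python
# Rough sketch of the keyword-density measure early engines over-relied
# on: the keyword's share of all words on the page.
def keyword_density(text: str, keyword: str) -> float:
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)


# A keyword-stuffed snippet scores implausibly high.
page_text = "Cheap flights! Book cheap flights here. Cheap flights daily."
print(f"{keyword_density(page_text, 'cheap'):.0%}")
```

Because a webmaster could push this number as high as they liked, density alone says nothing about relevance, which is what drove search engines toward off-page signals such as links.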
RELATIONSHIP WITH SEARCH ENGINES:
Webmasters made great efforts to rank well in the search engines, and some manipulated their rankings by stuffing pages with excessive or irrelevant keywords. Search engines such as AltaVista adjusted their algorithms to prevent webmasters from manipulating rankings, and many offending sites were banned from the search results.
Leading search engines such as Google and Yahoo! use crawlers to find pages for their search results. Pages that are already linked from indexed pages do not need to be submitted, because the crawlers find them automatically. Crawlers consider a number of factors when moving through a site, and pages buried many links deep below the root directory are harder for a search engine to reach and crawl.
To keep undesirable pages out of the search index, webmasters can instruct spiders not to crawl certain pages or directories of the site through the standard robots.txt file, placed in the root directory of the domain.
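For example, a robots.txt file at the root of the site might read as follows (the paths and domain here are hypothetical):

```
# Served at https://www.example.com/robots.txt
# Keep all crawlers out of the admin and internal-search pages,
# but allow everything else.
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt is a voluntary convention: well-behaved crawlers honor it, but it does not hide a page from visitors or guarantee it stays out of every index.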