The Internet is huge, with billions of files and pages in every language and on every topic. For this reason, people decided to index it to make it easier to find what you are looking for. At first this was done manually, but because of the explosion of content people could not keep up, and automated search engines were born. These would crawl the web looking for new content and index it as best they could.

These crawlers, also called web bots, are just servers that read the content of pages much as we would read them in a web browser. They try to work out what each page is about and then index it.
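
To make that concrete, here is a minimal sketch in Python of what one step of a crawl might look like: fetch a page, collect its words for indexing, and gather the links to follow next. The start URL is a placeholder, and real crawlers do far more (politeness rules, deduplication, ranking signals); this only illustrates the basic fetch-parse-index loop described above.

```python
# Sketch of one crawl step: fetch a page, extract its text and links.
from html.parser import HTMLParser
from urllib.request import urlopen

class PageParser(HTMLParser):
    """Collects visible text and outgoing links from one HTML page."""
    def __init__(self):
        super().__init__()
        self.words = []
        self.links = []

    def handle_data(self, data):
        # Text between tags contributes words to the index.
        self.words.extend(data.lower().split())

    def handle_starttag(self, tag, attrs):
        # Anchor tags tell the crawler where to go next.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url):
    """Fetch one page and return the (words, links) found on it."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    parser = PageParser()
    parser.feed(html)
    return parser.words, parser.links

words, links = crawl("https://example.com/")  # placeholder URL
print(f"Found {len(words)} words and {len(links)} links to follow next.")
```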

When you go to a search engine such as Google or Bing, you are presented with an input box where you type what you are searching for. From the text you submit, the search engine then trawls through the billions of indexed pages to find the best match for your query.
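
A toy illustration of that matching step, under the simplifying assumption that "best match" just means "contains every query word": the engine keeps an inverted index mapping each word to the pages that contain it, and answers a query by intersecting those sets. The pages below are made up, and real engines add ranking, spelling correction, and much more on top of this.

```python
# Toy inverted index: word -> set of page IDs that contain it.
pages = {
    "page1": "python web crawler tutorial",
    "page2": "history of the web",
    "page3": "python tutorial for beginners",
}

index = {}
for page_id, text in pages.items():
    for word in text.lower().split():
        index.setdefault(word, set()).add(page_id)

def search(query):
    """Return the pages that contain every word in the query."""
    word_sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()

print(search("python tutorial"))  # {'page1', 'page3'}
```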

Today Google dominates search, but there are many engines to choose from, and although Google is the biggest, Bing and Yahoo also do a great job.

Over the years they have become very sophisticated and good at guessing what you are after. They try hard to make things simple for you, allowing you to type in a question and then presenting you with a list of pages the search engine thinks you want.
