Explain Spiders, Robots, and Crawlers?
Spiders, robots (or bots), and crawlers are different names for the same thing: programs used by search engines to explore the Internet and automatically download the content available on websites. They are built to scan websites in a systematic way.
The process of running a web crawler is known as web crawling.
To keep their databases up to date, many sites, especially search engines, use crawlers. One of the most important tasks of a web crawler is to keep a copy of every page it visits, so that the pages can be processed later (for example, indexed for search).
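The idea can be sketched as a breadth-first traversal that stores a local copy of each visited page. This is a minimal illustration only: the `SITE` dictionary below is a hypothetical in-memory stand-in for the web, whereas a real crawler would fetch pages over HTTP and respect politeness rules.

```python
from collections import deque

# Hypothetical in-memory "web": URL -> (page text, outgoing links).
# A real crawler would fetch each URL over HTTP instead.
SITE = {
    "/": ("home page", ["/about", "/blog"]),
    "/about": ("about page", ["/"]),
    "/blog": ("blog index", ["/blog/post-1"]),
    "/blog/post-1": ("first post", ["/blog"]),
}

def crawl(start):
    """Breadth-first crawl that keeps a copy of every visited page."""
    copies = {}                  # the crawler's stored copy of each page
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in copies:        # skip pages already visited
            continue
        text, links = SITE[url]  # simulated fetch
        copies[url] = text       # store the copy for later processing
        queue.extend(links)
    return copies

pages = crawl("/")
```

After the crawl, `pages` holds a copy of every page reachable from the start URL, which is exactly the material a search engine would then index.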
There are ways to block a crawler from indexing a page. The most common is the robots.txt file placed at the root of a website.
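As a quick sketch of how this works, Python's standard library includes `urllib.robotparser`, which reads robots.txt rules and answers whether a given crawler may fetch a given URL. The robots.txt content and the example.com URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block every crawler from /private/,
# leave the rest of the site open.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# "*" means any user agent; a polite crawler checks this before fetching.
blocked = rp.can_fetch("*", "https://example.com/private/data.html")  # False
allowed = rp.can_fetch("*", "https://example.com/index.html")         # True
```

Well-behaved crawlers fetch robots.txt first and skip any path the file disallows; note that robots.txt is a convention, not an enforcement mechanism.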
As the above shows, a great deal of machinery runs behind every search operation you perform.