Bots, also known as 'web robots', 'WWW robots', or 'internet bots', are software applications that perform automated, repetitive tasks on the internet. Their largest use is in web spidering (crawling), where an automated script fetches, analyzes, and indexes information from web servers. Each server can host a file called robots.txt at its root containing a set of rules meant to guide bots in indexing the site (assuming the bot is designed to respect those guidelines).
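
As a minimal sketch of how a well-behaved bot might honor those rules, the snippet below uses Python's standard urllib.robotparser to download a site's robots.txt and check whether a given page may be fetched. The site URL and the user-agent string "ExampleBot" are purely illustrative.

```python
from urllib.robotparser import RobotFileParser

# Illustrative values; any real bot would substitute its own
# user-agent name and the site it intends to crawl.
SITE = "https://example.com"
USER_AGENT = "ExampleBot"

# Fetch and parse the robots.txt file at the site's root.
rp = RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

# Before requesting a page, ask whether the rules allow it
# for this particular user agent.
page = f"{SITE}/some/page.html"
if rp.can_fetch(USER_AGENT, page):
    print(f"{USER_AGENT} may fetch {page}")
else:
    print(f"{USER_AGENT} should skip {page}")
```

Compliance is voluntary: robots.txt is a convention, not an enforcement mechanism, so a bot only skips disallowed pages if it is written to perform a check like the one above.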