A crawler (also known as a spider or bot) is an automated program designed to systematically browse and explore the Internet. It follows links from page to page, tirelessly discovering and indexing vast amounts of content.

See also: Search Engine, Web Scraping, Indexing, Spider
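The link-following behavior described above can be sketched as a breadth-first traversal. This is a minimal illustration, not a production crawler: the `fetch` parameter and the in-memory `PAGES` dictionary are hypothetical stand-ins for real HTTP requests to the web.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, limit=100):
    """Breadth-first crawl: follow links page to page, visiting each URL once.

    `fetch` is a caller-supplied function returning a page's HTML
    (a stand-in for an HTTP GET in this sketch).
    """
    seen = {start_url}
    frontier = deque([start_url])
    visited = []
    while frontier and len(visited) < limit:
        url = frontier.popleft()
        html = fetch(url)
        visited.append(url)          # "index" the page (here: just record it)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:    # enqueue newly discovered links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return visited

# A tiny in-memory "web" standing in for real pages.
PAGES = {
    "/a": '<a href="/b"></a><a href="/c"></a>',
    "/b": '<a href="/a"></a>',
    "/c": "",
}

print(crawl("/a", PAGES.get))  # → ['/a', '/b', '/c']
```

A real crawler would add politeness (robots.txt, rate limiting), URL normalization, and persistent storage, but the visit-extract-enqueue loop above is the core of the technique.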