Googlebot is a web-crawling bot (also known as a spider or webcrawler) that gathers the web page information used to build Google's search engine results pages (SERPs).

Googlebot collects documents from the web to build Google's search index. By continually gathering documents, the software discovers new pages and updates to existing ones. Googlebot uses a distributed design spanning many computers so it can scale as the web grows.
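The discover-and-queue loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not Google's implementation): a breadth-first crawl that fetches a page, extracts its links, and queues any unseen URLs. The `fetch` callback and the tiny in-memory "web" are stand-ins for real HTTP requests.

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor (<a>) tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, fetch):
    """Breadth-first crawl: fetch each page, extract links, queue unseen ones."""
    frontier = deque([seed])   # URLs waiting to be fetched
    seen = {seed}              # URLs already discovered
    index = {}                 # URL -> fetched document
    while frontier:
        url = frontier.popleft()
        html = fetch(url)
        if html is None:
            continue           # unreachable page; skip it
        index[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return index

# Hypothetical in-memory "web" standing in for real HTTP fetches.
PAGES = {
    "/":  '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B</a>',
    "/b": '<a href="/">home</a>',
}
index = crawl("/", PAGES.get)
```

Starting from the single seed page `/`, the crawl discovers and indexes all three pages. A production crawler would add politeness delays, robots.txt handling, and, as noted above, distribute the frontier across many machines.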