Optimized for Google

Co-founder Larry Page once described the “perfect search engine” as something that “understands exactly what you mean and gives you back exactly what you want.” We can’t claim that Google delivers on that vision 100 percent today, but we’re always working on new technologies aimed at bringing all of Google closer to that ideal.
Before you even enter your query in the search box, Google is continuously traversing the web with software programs called crawlers, or “Googlebots.” A crawler visits a page, copies its content, and follows the links on that page to the pages they point to, repeating this process until it has crawled billions of pages on the web.
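
In outline, the crawl is a breadth-first traversal over a frontier of URLs. Here is a minimal sketch in Python (the function names are mine, and a real crawler adds politeness rules, robots.txt handling, deduplication, and massive parallelism):

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collect the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=100):
    """Breadth-first crawl: visit a page, keep a copy, follow its links."""
    frontier = deque([seed_url])  # pages waiting to be visited
    seen = {seed_url}             # never enqueue the same URL twice
    pages = {}                    # url -> raw HTML, the crawler's copy of the web
    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable pages are simply skipped
        pages[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute.startswith(("http://", "https://")) and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return pages
```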

Next, Google processes these pages and builds an index, much like the index in the back of a book. If you think of the web as a massive book, then Google's index is a list of all the words on those pages and where they're located, along with information about the links from those pages, and so on. The index is parceled into manageable sections and stored across a large network of computers around the world.
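
The book analogy corresponds to the classic inverted index: for every word, a list of the documents that contain it and the positions where it occurs. A toy version, assuming the `pages` dict from the crawl sketch above (the tokenizer and the hash-based sharding are simplifications):

```python
import re
import zlib
from collections import defaultdict

def build_index(pages):
    """Map each word to (url, position) pairs -- like a book index
    that lists, for every term, the pages where it appears."""
    index = defaultdict(list)
    for url, text in pages.items():
        words = re.findall(r"[a-z0-9]+", text.lower())
        for position, word in enumerate(words):
            index[word].append((url, position))
    return index

def shard_for(word, num_shards=1000):
    """Parceling the index into sections: deterministically assign each
    word's posting list to one of num_shards machines."""
    return zlib.crc32(word.encode("utf-8")) % num_shards
```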

When you type a query into the Google search box, it is sent to Google's machines and compared with all the documents stored in our index to identify the most relevant matches. In a split second, our system prepares a list of the most relevant pages and also determines which sections, bits of text, images, videos, and more are relevant. What you get is a list of search results, with the relevant information excerpted in a “snippet” (a short text summary) beneath each result.
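
Continuing the same toy setup, serving a query means looking up each term's posting list, intersecting them, ranking the surviving pages, and excerpting a snippet around a match. The term-frequency scoring below is a placeholder; Google's actual ranking draws on many more signals:

```python
def search(index, pages, query, snippet_width=80):
    """Return (url, snippet) pairs for pages containing every query term,
    best-scoring first."""
    terms = query.lower().split()
    # Intersect the posting lists: a page qualifies only if all terms occur.
    candidates = None
    for term in terms:
        urls = {url for url, _ in index.get(term, [])}
        candidates = urls if candidates is None else candidates & urls
    if not candidates:
        return []

    def score(url):
        # Naive relevance: total occurrences of the query terms on the page.
        return sum(1 for term in terms for u, _ in index.get(term, []) if u == url)

    results = []
    for url in sorted(candidates, key=score, reverse=True):
        text = pages[url]
        pos = text.lower().find(terms[0])  # excerpt around the first term
        snippet = text[max(0, pos - snippet_width // 2): pos + snippet_width]
        results.append((url, snippet.strip()))
    return results

# Usage: results = search(build_index(pages), pages, "perfect search engine")
```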