1 Answer
Crawling – Crawling is done by an automated bot, also called a spider, that scans sites and collects details about each page: titles, images, keywords, links to other pages, and so on. Google's spiders once read several hundred pages per second; nowadays it's in the thousands.
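To make the crawling step concrete, here is a minimal single-page sketch in Python using only the standard library: it fetches one URL, pulls out the title, and collects the links the page points to. The URL and function names are illustrative, not from the answer; a real crawler would also honor robots.txt, maintain a frontier queue of links to visit, deduplicate pages, and run many fetches in parallel.

```python
# Minimal "crawl one page" sketch: fetch a URL, extract its title and links.
from html.parser import HTMLParser
from urllib.request import urlopen


class PageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def crawl(url):
    # Download the page and parse out the details a spider would record.
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = PageParser()
    parser.feed(html)
    return {"url": url, "title": parser.title.strip(), "links": parser.links}


if __name__ == "__main__":
    page = crawl("https://example.com")  # hypothetical starting URL
    print(page["title"], "-", len(page["links"]), "links found")
```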
Indexing – Indexing is when the data from a crawl is processed and placed in a database, so that pages can later be looked up by the terms they contain.
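As a toy illustration of that processing step (an assumption about the simplest form such a database can take, not how Google's index actually works), an inverted index maps each keyword to the set of pages that contain it, so a search goes from term to documents instead of rescanning every page:

```python
# Toy inverted index built from crawled page text.
from collections import defaultdict

index = defaultdict(set)  # term -> set of URLs containing that term


def add_to_index(url, text):
    for term in text.lower().split():
        index[term].add(url)


# Hypothetical crawled pages for illustration.
add_to_index("https://example.com/a", "search engines crawl and index pages")
add_to_index("https://example.com/b", "spiders crawl linked pages")

print(sorted(index["crawl"]))  # both pages mention "crawl"
print(sorted(index["index"]))  # only the first page does
```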