One common cause of a crawling spike is sites with “infinite spaces”— areas like calendar modules or endlessly filterable product listings that can generate unlimited potential URLs. He recommends using the robots.txt file to block crawlers from accessing these infinite spaces.
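As a sketch of what that might look like, a site whose calendar module and filtered product listings generate unbounded URL spaces could add rules like the following (the paths and query pattern here are illustrative only; substitute whatever URL patterns your own infinite spaces actually use):

```
# Hypothetical robots.txt — block crawlers from paths that can
# generate unlimited URLs. Adjust the patterns to your own site.
User-agent: *
Disallow: /calendar/
Disallow: /products/filter
Disallow: /*?sort=
```

Note that robots.txt rules only discourage crawling; they do not remove already-indexed URLs or secure the content behind them.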
Another troubling cause of a crawling spike is a security breach where hackers inject spam onto a reputable site.
“If your site generally has pages that search users find helpful, crawlers will get excited about these infinite spaces for a time.”