Max Crawl Rate
A setting in Google Search Console that limits how fast Googlebot crawls your website.
💡 Think of it like this: Imagine Googlebot is a postman who can only deliver a certain number of letters per hour. Max Crawl Rate sets that limit: not which streets the postman may visit, but how often he is allowed to knock.
How Max Crawl Rate Works
Max crawl rate refers to a setting, historically available in Google Search Console, that let website owners limit the speed at which Googlebot crawls their site. It is useful for websites that experience server performance issues or heavy resource consumption when Googlebot crawls aggressively: a lower crawl rate can protect server stability without permanently blocking crawlers. Note that Google deprecated the Search Console crawl rate limiter tool in early 2024; crawl rate is now adjusted automatically, and sites that need temporary relief can signal Googlebot to slow down by returning 500, 503, or 429 responses.
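As an illustration of the server-protection idea, here is a minimal sketch of how a server might temporarily slow crawlers by answering with HTTP 503 and a Retry-After header when under load. The thresholds and names below are illustrative assumptions, not a real API, and real Googlebot verification uses reverse DNS rather than a user-agent substring.

```python
# Hypothetical sketch: back off crawler traffic when the server is busy.
# A sustained run of 503/429 responses is a signal to Googlebot to reduce
# its crawl rate; all names and thresholds here are assumptions.

GOOGLEBOT_TOKEN = "Googlebot"   # naive substring check for illustration only
LOAD_THRESHOLD = 0.85           # assumed "server is struggling" cutoff

def crawler_response(user_agent: str, server_load: float) -> tuple:
    """Return (status_code, headers) for an incoming request."""
    if GOOGLEBOT_TOKEN in user_agent and server_load > LOAD_THRESHOLD:
        # Ask the crawler to come back later instead of blocking it outright.
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

The key design point is that the slowdown is temporary and reversible: once load drops, normal 200 responses resume and crawling recovers, unlike a permanent robots.txt block.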
Why Max Crawl Rate Matters for SEO
By default, Google automatically adjusts its crawl rate based on a site’s server response times and overall health signals. However, during heavy traffic periods or after migrations, manually throttling the crawl rate via Search Console provides an additional safeguard. It is important to note that restricting crawl rate may slow down the discovery and indexing of new or updated content. If you’re unsure how Max Crawl Rate is impacting your site, working with an experienced SEO consultant can help you identify the problem and fix it efficiently.
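Before deciding whether throttling is needed at all, it helps to measure how fast Googlebot is actually crawling. A small sketch that counts Googlebot hits per hour from server access-log lines; the log format assumed here is the common Apache/Nginx style, and the helper name is hypothetical:

```python
from collections import Counter

def googlebot_hits_per_hour(log_lines):
    """Count Googlebot requests per hour from common-log-format lines.

    Assumes bracketed timestamps like [10/Feb/2025:14:03:22 +0000].
    Hypothetical helper for log analysis, not a Search Console API.
    """
    hits = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        # Extract the day-plus-hour prefix, e.g. "10/Feb/2025:14".
        start = line.find("[") + 1
        hour_key = line[start:start + 14]
        hits[hour_key] += 1
    return hits
```

Comparing these hourly counts against periods of server slowdown shows whether Googlebot traffic is actually the cause before you restrict it.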
Common Max Crawl Rate Mistakes
A common mistake is setting the crawl rate too low and leaving it there, which keeps slowing the discovery and indexing of new content long after the original server problem has passed. Conversely, websites wanting faster indexing should focus on improving crawl budget efficiency by reducing duplicate content, fixing redirect chains, blocking unimportant pages via robots.txt, and ensuring fast server response times, rather than relying solely on the max crawl rate setting.
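For example, steering crawl budget away from unimportant pages might look like this in robots.txt (the paths are illustrative placeholders, not a recommendation for any specific site):

```
User-agent: *
Disallow: /internal-search/
Disallow: /cart/
Disallow: /print/

Sitemap: https://www.example.com/sitemap.xml
```

Blocking low-value sections like these lets Googlebot spend its visits on the pages you actually want indexed.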
Do’s and Don’ts: Max Crawl Rate
✅ Do monitor server performance before and after changing how Googlebot crawls your site.
✅ Do restore normal crawling once the server issue that prompted the limit is resolved.
❌ Don’t use a crawl limit as a substitute for fixing slow server response times.
❌ Don’t forget that throttling Googlebot also slows the indexing of new and updated content.
TL;DR: A setting in Google Search Console that limits how fast Googlebot crawls your website.
If you remember one thing: use crawl-rate limits as a temporary safeguard for server stability, because throttling Googlebot also slows how quickly your new and updated content gets indexed.