Crawler Rate Limit allows you to throttle requests made by web crawlers, bots, and spiders. It detects whether a request comes from a crawler/bot/spider by inspecting the `User-Agent` HTTP header and then limits the number of requests the crawler is allowed to perform within a given time interval. Once the limit is exceeded, the server responds with HTTP status 429 (Too Many Requests).
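
To illustrate the general mechanism (not this project's actual implementation), here is a minimal Go sketch of such a middleware. The pattern list, limit, and window values are hypothetical placeholders, and the fixed-window counter shown here is one of several possible rate-limiting strategies:

```go
package main

import (
	"fmt"
	"net/http"
	"strings"
	"sync"
	"time"
)

// crawlerPatterns holds example User-Agent substrings treated as crawlers;
// a real deployment would make this list configurable.
var crawlerPatterns = []string{"bot", "crawler", "spider"}

// isCrawler reports whether the User-Agent header matches a known pattern.
func isCrawler(userAgent string) bool {
	ua := strings.ToLower(userAgent)
	for _, p := range crawlerPatterns {
		if strings.Contains(ua, p) {
			return true
		}
	}
	return false
}

// rateLimiter is a simple fixed-window counter: at most `limit` requests
// per `window`, shared across all crawlers for brevity.
type rateLimiter struct {
	mu          sync.Mutex
	limit       int
	window      time.Duration
	count       int
	windowStart time.Time
}

// allow increments the counter and reports whether the request
// is still within the current window's limit.
func (rl *rateLimiter) allow() bool {
	rl.mu.Lock()
	defer rl.mu.Unlock()
	now := time.Now()
	if now.Sub(rl.windowStart) >= rl.window {
		rl.windowStart = now // start a fresh window
		rl.count = 0
	}
	rl.count++
	return rl.count <= rl.limit
}

// crawlerRateLimit wraps an http.Handler and throttles crawler traffic,
// responding with 429 once the limit is exceeded.
func crawlerRateLimit(next http.Handler, rl *rateLimiter) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if isCrawler(r.Header.Get("User-Agent")) && !rl.allow() {
			http.Error(w, "Too Many Requests", http.StatusTooManyRequests)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	hello := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "hello")
	})
	// Example policy: crawlers may make at most 10 requests per minute.
	rl := &rateLimiter{limit: 10, window: time.Minute}
	http.ListenAndServe(":8080", crawlerRateLimit(hello, rl))
}
```

With this sketch running, `curl -A "Googlebot" http://localhost:8080/` would succeed for the first 10 requests in a minute and receive 429 afterwards, while requests with a non-crawler User-Agent pass through unthrottled.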