chicomarx
Instead of blocking them, slow their request rate down?
"Bots, spiders, and other crawlers hitting your dynamic pages can cause extensive resource (memory and CPU) usage. This can lead to high load on the server and slow down your site(s).
Use the following to slow some, but not all, good bots:
User-agent: *
Crawl-delay: 10
Explanation of the fields above:
User-agent: *
Applies to all User-agents.
Crawl-delay
Tells the User-agent to wait 10 seconds between each request to the server.
Googlebot ignores the crawl-delay directive.
To slow down Googlebot, you’ll need to sign up at Google Search Console.
Once your account is created, you can set the crawl rate in their panel."
https://help.dreamhost.com/hc/en-us/articles/216105077-How-can-I-control-bots-spiders-and-crawlers-
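For bots that do honor robots.txt, you can check how a given crawler would read the directive above with Python's standard-library `urllib.robotparser`. A minimal sketch (the bot name "ExampleBot" is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body containing the directive quoted above.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 10",
])

# A well-behaved client queries the delay for its own user agent;
# the "*" group applies to any agent without a more specific entry.
delay = rp.crawl_delay("ExampleBot")
print(delay)  # 10
```

Note this only tells you what compliant crawlers should do; nothing in robots.txt enforces the delay, which is why Googlebot's rate has to be managed through Search Console instead.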
"Bots, spiders, and other crawlers hitting your dynamic pages can cause extensive resource (memory and CPU) usage. This can lead to high load on the server and slow down your site(s).
Use the following to slow some, but not all, good bots:
User-agent: * Crawl-Delay: 10
Explanation of the fields above:
User-agent: *
Applies to all User-agents.
Crawl-delay
Tells the User-agent to wait 10 seconds between each request to the server.
Googlebot ignores the crawl-delay directive.
To slow down Googlebot, you’ll need to sign up at Google Search Console.
Once your account is created, you can set the crawl rate in their panel."
https://help.dreamhost.com/hc/en-us/articles/216105077-How-can-I-control-bots-spiders-and-crawlers-