IMCDb Forum

chicomarx
Instead of blocking them, why not slow their request rate down?

"Bots, spiders, and other crawlers hitting your dynamic pages can cause extensive resource (memory and CPU) usage. This can lead to high load on the server and slow down your site(s).

Use the following to slow some, but not all, good bots:
User-agent: *
Crawl-Delay: 10
Explanation of the fields above:
User-agent: *
Applies to all User-agents.
Crawl-delay
Tells the User-agent to wait 10 seconds between each request to the server.

Googlebot ignores the crawl-delay directive.
To slow down Googlebot, you’ll need to sign up at Google Search Console.
Once your account is created, you can set the crawl rate in their panel."

https://help.dreamhost.com/hc/en-us/articles/216105077-How-can-I-control-bots-spiders-and-crawlers-
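For what it's worth, a well-behaved bot can read that directive with Python's standard library. A minimal sketch, assuming a robots.txt containing the two lines quoted above (the inline robots.txt text here is a stand-in, not IMCDb's actual file):

```python
import urllib.robotparser

# Hypothetical robots.txt content, matching the DreamHost example above.
robots_txt = """\
User-agent: *
Crawl-delay: 10
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# crawl_delay() returns the delay in seconds for the given user agent,
# or None if no Crawl-delay directive applies to it.
delay = parser.crawl_delay("*")
print(delay)
```

A polite crawler would then call time.sleep(delay) between requests. Note this only helps with bots that bother to read robots.txt at all; abusive crawlers ignore it entirely, which is why blocking still has its place.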
KelCommunity.be :: © 2004-2024 Akretio SPRL :: Powered by Kelare