On 04/21/2017 05:30 AM, Erik Bray wrote:

> Does this mean that we need some robots.txt somewhere, perhaps after
> some restructuring, which would protect expensive resources from this
> sort of overload?

There already is a robots.txt, and this host was not respecting it.
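For context, a robots.txt that shields expensive dynamic endpoints from well-behaved crawlers typically looks something like the following; the paths here are illustrative, not the actual Trac configuration:

```
User-agent: *
Disallow: /search
Disallow: /timeline
Disallow: /changeset
Crawl-delay: 10
```

Of course, as noted above, this only helps against crawlers that actually honor robots.txt.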


If this becomes a bigger problem, one solution is to make sure the main anonymous Trac pages are served as cached plain HTML files, and to severely limit expensive features such as the search function (with a CAPTCHA, a rate limit, etc.).

One motivated person can still slow you down, but they'll have to at least try -- I think most of these bot attacks are only accidentally a denial of service.
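The rate-limit idea above can be sketched with a standard token-bucket limiter; this is a minimal illustration of the technique, not anything Trac ships with, and the class name and parameters are made up for the example:

```python
import time


class TokenBucket:
    """Token-bucket rate limiter: allow `rate` requests per second,
    with short bursts of up to `capacity` requests."""

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity      # start with a full bucket
        self.last = time.monotonic()

    def allow(self):
        # Refill tokens in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


# With capacity 5 and 1 token/second, a rapid burst of 10 requests
# gets its first 5 through and the rest rejected.
limiter = TokenBucket(rate=1, capacity=5)
results = [limiter.allow() for _ in range(10)]
```

In practice you would key one bucket per client IP in front of the search endpoint, so a runaway bot only throttles itself rather than taking down the whole site.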

--
You received this message because you are subscribed to the Google Groups 
"sage-devel" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to sage-devel+unsubscr...@googlegroups.com.
To post to this group, send email to sage-devel@googlegroups.com.
Visit this group at https://groups.google.com/group/sage-devel.
For more options, visit https://groups.google.com/d/optout.