I was getting several thousand error messages a day complaining about this.
I would also find that the web interface would time out or die - going away
to make a cup of tea was the only thing to do, ie just wait.

I am getting far fewer of these today after I tweaked my robots.txt. The
problem was search engines (and maybe AI monsters) crawling the site and
overloading it. The machine has 1.5GB of RAM & plenty of fast (SSD) swap. It
copes well enough except when the crawlers come in.

So, 2 fixes:

        Crawl-delay: 10

Do not visit more frequently than once every 10 seconds. Google ignores this
directive as it expects you to waste time setting crawl rates in its Search
Console instead.

        Disallow: /mailman3/

Do not visit at all. However, if you do this, the archives will not be
indexed by search engines.
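
Putting the two together, a minimal robots.txt might look like the sketch
below. The "User-agent: *" stanza is my assumption (it applies the rules to
all crawlers that honour robots.txt); adjust the path to match your install:

        User-agent: *
        Crawl-delay: 10
        Disallow: /mailman3/

Note that these directives apply per user-agent group, so they must sit under
a "User-agent:" line to take effect.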

All that I can say is: it works for me - I now see about 1% of the errors
that I had previously.
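
A rough way to check whether a tweak like this is helping is to tally
requests per user agent in the web server's access log. This is only a
sketch: the inline sample log here stands in for a real one, and the awk
field positions assume Apache's "combined" log format (user agent is the
sixth double-quote-delimited field):

```shell
#!/bin/sh
# Create a tiny sample access log for illustration (a real log would live
# somewhere like /var/log/httpd/access_log - path varies by distribution).
cat > access_sample.log <<'EOF'
1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET /mailman3/ HTTP/1.1" 200 123 "-" "Googlebot/2.1"
1.2.3.4 - - [01/Jan/2025:00:00:10 +0000] "GET /mailman3/ HTTP/1.1" 200 123 "-" "Googlebot/2.1"
5.6.7.8 - - [01/Jan/2025:00:00:20 +0000] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0"
EOF

# Count hits per user agent, most frequent first.
# In combined log format the user agent is field 6 when split on `"`.
awk -F'"' '{print $6}' access_sample.log | sort | uniq -c | sort -rn
```

Run before and after the robots.txt change and compare the counts for the
crawlers that were hammering the site.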

-- 
Alain Williams
Linux/GNU Consultant - Mail systems, Web sites, Networking, Programmer, IT 
Lecturer.
+44 (0) 787 668 0256  https://www.phcomp.co.uk/
Parliament Hill Computers. Registration Information: 
https://www.phcomp.co.uk/Contact.html
#include <std_disclaimer.h>
_______________________________________________
Mailman-users mailing list -- mailman-users@mailman3.org
To unsubscribe send an email to mailman-users-le...@mailman3.org
https://lists.mailman3.org/mailman3/lists/mailman-users.mailman3.org/
Archived at: 
https://lists.mailman3.org/archives/list/mailman-users@mailman3.org/message/W3UDG2UDWVTLMEYIECIRSQFGJZCVWY2I/
