On 09. 07. 25 at 11:49, Michal Schorm wrote:
Thoughts, with a disclaimer - I have zero technical experience in this area.

As I see it, the problem is:
- we have service(s) running on good will, volunteer work and donations, for everyone to use freely
- it just simply does not work anymore, it burns all of the above, all of the time
- so let's switch to strictly authentication-only services.
I just thought about this yesterday, when I wanted to see RPM info in Koji, only to figure out it is not available, presumably due to AI scrapers (and checking now, my assumption was correct [1]). It used to be possible to log in to Koji; I would not mind logging in to access those logs 🤷♂️
Vít

[1] https://pagure.io/fedora-infrastructure/issue/12621
- Guarantee the people in the project can work undisrupted.
- You can swiftly block people per account, and you can impose account restrictions. Real people will have to announce their resource-intensive operations ahead of time (e.g. when they expect to make 1000 requests per minute).
- And if we have energy to spare, we can resume the open and free-for-all instances on extremely resource-restricted VMs, which will continue running on good will.

When the situation gets better, we can beef up the open public instances. Or we can experiment with any kind of approaches and tools to cut off the botnets on those resource-restricted public instances.

I too like the availability of all the stuff all the time without any login wall. But IMO it is not worth burning all of us to the ground for months, years, or maybe decades till we find a solution that could at least keep up with the attacks.

Does it make any sense?

Michal

--
Michal Schorm
Software Engineer
Databases Team
Red Hat

--

On Wed, Jul 9, 2025 at 10:22 AM Felix Schwarz <fschw...@fedoraproject.org> wrote:

On 09.07.25 at 04:45, Kevin Kofler via devel wrote:

Is that not configuration-dependent? Checking only user agents with "Mozilla" in them makes it very easy for the AI bots to bypass this.

Of course, but so far this is a great heuristic. As always, you should never expect that Anubis or anything else will keep all AI crawlers out forever. In the current age we have to accept that we might have to update Anubis and/or its config on very short notice if bots start to adapt (or accept that bots might cause a huge load on your site).

However, the good thing is that once the bots adapt and no longer include "Mozilla" in their user agent, they will become identifiable much more easily, allowing Anubis to block them with even less collateral damage.

Felix
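For readers unfamiliar with the heuristic Felix describes, here is a minimal sketch of the idea in Python - not Anubis's actual code or configuration, and the function name classify_request is made up for illustration. Real browsers virtually always send a User-Agent containing "Mozilla", so challenging only those requests leaves obviously non-browser agents to be handled by explicit rules.

# Sketch of the user-agent heuristic discussed above; illustrative only,
# not Anubis's actual implementation or config format.

def classify_request(user_agent: str) -> str:
    """Decide how to treat a request based on its User-Agent header."""
    if "Mozilla" in user_agent:
        # Looks like a browser (or a bot pretending to be one):
        # serve a proof-of-work challenge before the real content.
        return "challenge"
    # Anything that does not claim to be a browser stands out and can be
    # matched by an explicit allow/deny rule with little collateral damage.
    return "explicit-rule"

print(classify_request("Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0"))  # challenge
print(classify_request("SomeAICrawler/1.0"))  # explicit-rule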