Hi,

For the remote origin, the usual configuration is 

[remote "origin"]
url = git://github.com/sagemath/sage.git
fetch = +refs/heads/*:refs/remotes/origin/* 

If the same fetch setting is used for the trac remote, then the git client 
will try to download all branch heads from the trac server on every fetch. 
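A narrower refspec for the trac remote avoids this by only fetching the 
branches you actually need. A minimal sketch (assuming the remote is named 
"trac" and you mainly track master; use whatever trac URL you already have 
configured):

[remote "trac"]
url = git@trac.sagemath.org:sage.git
fetch = +refs/heads/master:refs/remotes/trac/master

Individual ticket branches can then still be fetched explicitly, e.g. 
git fetch trac u/user/branchname, without pulling down every head on the 
server.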

I confess that I mistakenly committed this "crime" recently :-( and fixed 
it as soon as I recognized it.

Might this be a cause of the recent trac slowdown? (I did not experience 
the slowdown myself, so I am not sure whether this is actually related.)


On Friday, April 28, 2017 at 12:20:06 PM UTC+2, Erik Bray wrote:
>
> Did it ever start working for you again, Anne?  I'm not seeing any 
> problems here. 
>
> For what it's worth, before anyone else asks, the original problem 
> that this thread was about had to do with a badly behaving web scraper 
> that was causing a DOS on the Trac website itself--this would not 
> impact git access over SSH since that doesn't involve the web server. 
>
> (Also, unrelated, but consider using git.sagemath.org for accessing 
> the git repository, as opposed to trac.sagemath.org.  While both 
> addresses currently point to the same server, that may not always be 
> the case.) 
>
> On Fri, Apr 28, 2017 at 5:59 AM, Anne Schilling 
> <anne1.s...@gmail.com> wrote: 
> > I just tried 
> > 
> > git fetch origin 
> > fatal: unable to connect to trac.sagemath.org: 
> > trac.sagemath.org[0: 104.197.143.230]: errno=Operation timed out 
> > 
> > Is this related? 
> > 
> > Anne 
> > 
> > 
> > On Saturday, April 22, 2017 at 12:21:29 PM UTC-7, Michael Orlitzky 
> wrote: 
> >> 
> >> On 04/21/2017 05:30 AM, Erik Bray wrote: 
> >> >> 
> >> >> Does this mean that we need some robots.txt somewhere, perhaps after 
> >> >> some 
> >> >> restructuring, 
> >> >> which would protect expensive resources from this sort of overload? 
> >> > 
> >> > There already is a robots.txt and this host was not respecting it. 
> >> > 
> >> 
> >> If this becomes a bigger problem, one solution is to make sure the main 
> >> anonymous trac pages are cached plain HTML files, and to severely limit 
> >> e.g. the search function (with a CAPTCHA, rate limit, etc.) 
> >> 
> >> One motivated person can still slow you down, but they'll have to at 
> >> least try -- I think most of these bot attacks are only accidentally a 
> >> denial of service. 
> >> 
>

-- 
You received this message because you are subscribed to the Google Groups 
"sage-devel" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to sage-devel+unsubscr...@googlegroups.com.
To post to this group, send email to sage-devel@googlegroups.com.
Visit this group at https://groups.google.com/group/sage-devel.
For more options, visit https://groups.google.com/d/optout.
