On 5 November 2011 14:26, Troels Henriksen wrote:
> Étienne Faure writes:
>> Try this:
>>
>> wget
>> 'http://www.vim.org/scripts/script.php?script_id=3792&adding=dummy&arguments=that&could=be&a=horrible&hash=that&i=ve&seen=on&crappy=sites&unfortunatly=i&can=not&find=a&publicly=available&example=of&this=so&let=s&go=for&a=hash&here=514241337a3c43a0bb28
Étienne Faure writes:
> Try this:
>
> wget
> 'http://www.vim.org/scripts/script.php?script_id=3792&adding=dummy&arguments=that&could=be&a=horrible&hash=that&i=ve&seen=on&crappy=sites&unfortunatly=i&can=not&find=a&publicly=available&example=of&this=so&let=s&go=for&a=hash&here=514241337a3c43a0bb28
On Sat, Nov 5, 2011 at 15:08, wrote:
> * Étienne Faure [2011-11-05 14:59]:
>> On Sat, Nov 5, 2011 at 14:40, wrote:
>> >
>> > I'm afraid, then curl won't do either.
>>
>> As a matter of fact, it does:
>>
>
> great. then use curl. It's an acceptable dependency, just as much as wget
> is.
>
>> Renaming the file afterwards can lead to failure: if the
* Étienne Faure [2011-11-05 14:59]:
> On Sat, Nov 5, 2011 at 14:40, wrote:
> >
> > I'm afraid, then curl won't do either.
>
> As a matter of fact, it does:
>
great. then use curl. It's an acceptable dependency, just as much as wget
is.
> Renaming the file afterwards can lead to failure: if the
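For reference, the curl-based command could look roughly like this in a surf-style
config.h. This is only a sketch under assumptions (the DOWNLOAD macro shape and its
argument are mine, not the shipped setup): -J/--remote-header-name (curl >= 7.20)
takes the filename from the Content-Disposition header, -O is required alongside -J,
and -L follows redirects.

/* sketch only, assuming surf's Arg union (.v); $0 is the URI handed to sh -c */
#define DOWNLOAD(uri) { \
        .v = (char *[]){ "/bin/sh", "-c", \
                "exec curl -L -J -O \"$0\"", \
                uri, NULL } \
}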
On Sat, Nov 5, 2011 at 14:40, wrote:
> * Étienne Faure [2011-11-05 14:29]:
>> I'm afraid not. I tested it quickly for the same vim.org issue:
>>
>> webkit_download_get_suggested_filename(o)
>> returned "download_script.php" instead of, in this case,
>> "TagmaBufMgr.zip".
>
> I'm afraid, then curl won't do either.
* Étienne Faure [2011-11-05 14:29]:
> I'm afraid not. I tested it quickly for the same vim.org issue:
>
> webkit_download_get_suggested_filename(o)
> returned "download_script.php" instead of, in this case,
> "TagmaBufMgr.zip".
I'm afraid, then curl won't do either.
To be honest, while I a
On Sat, Nov 5, 2011 at 13:16, wrote:
> * Troels Henriksen [2011-11-05 12:51]:
>> There is a fix for this that involves using the
>> webkit_download_get_suggested_filename function and passing it to wget's
>> -O option, but I can't figure out how to prevent clobbering of an
>
> that might work?
>
sta...@cs.tu-berlin.de writes:
> * Troels Henriksen [2011-11-05 12:51]:
>> There is a fix for this that involves using the
>> webkit_download_get_suggested_filename function and passing it to wget's
>> -O option, but I can't figure out how to prevent clobbering of an
>
> that might work?
>
* Troels Henriksen [2011-11-05 12:51]:
> There is a fix for this that involves using the
> webkit_download_get_suggested_filename function and passing it to wget's
> -O option, but I can't figure out how to prevent clobbering of an
that might work?
fn=$result_of_webkit_download_get_suggested_filename
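Spelled out, the idea above might look roughly like this on the C side. It is a
sketch under assumptions (the handler name, the wget invocation and the .N suffixing
scheme are mine, not the patch under discussion): the suggested filename is handed
to a small shell wrapper that picks a free name before calling wget -O, so an
existing file never gets clobbered.

#include <webkit/webkit.h>

/* sketch: download the URI with wget, using WebKit's suggested filename,
 * appending .1, .2, ... if a file with that name already exists */
static gboolean
downloadrequested(WebKitWebView *view, WebKitDownload *o, gpointer d)
{
        const gchar *uri = webkit_download_get_uri(o);
        const gchar *sug = webkit_download_get_suggested_filename(o);
        gchar *qfn  = g_shell_quote(sug);
        gchar *quri = g_shell_quote(uri);
        gchar *cmd  = g_strdup_printf(
                "f=%s; i=0; while [ -e \"$f\" ]; do i=$((i+1)); f=%s.$i; done; "
                "exec wget -q -O \"$f\" -- %s", qfn, qfn, quri);
        gchar *argv[] = { "/bin/sh", "-c", cmd, NULL };

        g_spawn_async(NULL, argv, NULL, G_SPAWN_SEARCH_PATH,
                      NULL, NULL, NULL, NULL);
        g_free(cmd);
        g_free(qfn);
        g_free(quri);
        return FALSE;   /* do not let WebKit perform the download itself */
}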
Étienne Faure writes:
> Hello,
>
> I've had a small issue downloading scripts from www.vim.org:
>
> The URI ends with a php file and a variable argument. With the current
> wget setup, the script name isn't deduced from the header.
> Thus, the downloaded file's name is something like:
>
> download_script.php?src_id=1234
Hello,
I've had a small issue downloading scripts from www.vim.org:
The URI ends with a php file and a variable argument. With the current
wget setup, the script name isn't deduced from the header.
Thus, the downloaded file's name is something like:
download_script.php?src_id=1234
I managed to
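For the record, wget itself can take the name from the header: its (experimental)
--content-disposition option names the saved file after the server's
Content-Disposition header instead of the URI tail. A minimal sketch of how that
might be wired into a surf-style config.h follows; the DOWNLOAD macro shape here is
an assumption, not the shipped setup.

/* sketch, assuming surf's Arg union (.v); $0 is the URI handed to sh -c */
#define DOWNLOAD(uri) { \
        .v = (char *[]){ "/bin/sh", "-c", \
                "exec wget --content-disposition -- \"$0\"", \
                uri, NULL } \
}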
* sta...@cs.tu-berlin.de
> I post this anyway, someone might find it useful. Of course, something like
> "Mozilla/5.0 (X11; U; Linux; en-us) AppleWebKit/531.2+ (KHTML, like Gecko,
> surf-"VERSION") Safari/531.2+" would make more sense. Don't know how to use
> the
> static char useragent in the m
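One possible way to wire that up, purely as a sketch (the DOWNLOAD shape and the $1
plumbing are assumptions, and VERSION is expected to come from the build flags as in
config.mk): keep the string in config.h and pass it to the download command as a
second positional parameter, so wget identifies itself the same way the browser does.

/* sketch: reuse the config.h useragent for downloads */
static char *useragent = "Mozilla/5.0 (X11; U; Linux; en-us) "
        "AppleWebKit/531.2+ (KHTML, like Gecko, surf-" VERSION ") Safari/531.2+";

#define DOWNLOAD(uri) { \
        .v = (char *[]){ "/bin/sh", "-c", \
                "exec wget --user-agent=\"$1\" -- \"$0\"", \
                uri, useragent, NULL } \
}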
sta...@cs.tu-berlin.de writes:
> Hm, writing this, I figured out the arxiv folks are afraid of mass downloads
> and DoS and just look into the user agent. So adding --user-agent foo solved
> the problem.
Good observation. Downloading should of course use the Surf user
agent. (I may have run acr
* Nick [2011-11-04 19:30]:
> I'll look myself, but if anyone else finds
> one, please let us know.
I've rarely had trouble downloading anything, maybe because I rarely do so
from crappy places; until recently, when I repeatedly got this on arxiv.org
trying to download a pdf (dillo and others ha
Quoth Peter John Hartman:
> One thing that *rumor* has it surf can't handle are fancy-schmancy
> downloads, for instance, I'm told RapidShare fails[1].
More testing has shown that this is actually a lie (sorry about
that). I have definitely seen failures in the past with downloading,
but would r
On 2011-11-03 16:30, Peter John Hartman wrote:
>
> Second of all, and instead, it just prints to stdout (a) the fact that the
> Download has started, together with the filename, and (b) the fact that it
> has finished/cancelled/errored, together with the filename.
Not much more to add a progress b
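For reference, that behaviour could be implemented along these lines; a sketch only,
with an invented callback name, not njw's patch itself. The download's status
property is watched and the suggested filename is printed on stdout when the
download starts and again when it finishes, is cancelled, or errors.

#include <webkit/webkit.h>

/* sketch: report download milestones on stdout */
static void
downloadstatus(GObject *obj, GParamSpec *pspec, gpointer data)
{
        WebKitDownload *dl = WEBKIT_DOWNLOAD(obj);
        const gchar *fn = webkit_download_get_suggested_filename(dl);

        switch (webkit_download_get_status(dl)) {
        case WEBKIT_DOWNLOAD_STATUS_STARTED:
                g_print("download started: %s\n", fn);
                break;
        case WEBKIT_DOWNLOAD_STATUS_FINISHED:
                g_print("download finished: %s\n", fn);
                break;
        case WEBKIT_DOWNLOAD_STATUS_CANCELLED:
                g_print("download cancelled: %s\n", fn);
                break;
        case WEBKIT_DOWNLOAD_STATUS_ERROR:
                g_print("download error: %s\n", fn);
                break;
        default:
                break;
        }
}

/* hooked up from the download handler, e.g.:
 *   g_signal_connect(o, "notify::status", G_CALLBACK(downloadstatus), NULL); */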
Hi,
One thing that *rumor* has it surf can't handle are fancy-schmancy
downloads, for instance, I'm told RapidShare fails[1]. On #suckless,
it was decided that we might want to slip a modified version of njw's
patch[2] into surf tip. (The modified version is below.)
The deal is that in the unmo
Very interesting though, downloading did work for me. (Doesn't make
sense to me why it did)
Anyway. It was added some time ago to stop loading a page when a
download starts. But I thought I removed this piece of code. I'll fix
that this evening.
2010/4/7 julien steinhauser :
> On Wed, Apr 07, 2010
On Wed, 7 Apr 2010 11:06:43 +0200
julien steinhauser wrote:
> On Wed, Apr 07, 2010 at 11:31:18AM +0200, Nibble wrote:
> > reading the surf code I found the following two lines at
> > loadstatuschange():
> >
> > 423 if(c->download)
> > 424 stop(c, NULL);
> >
> > Why are they there?
>
On Wed, Apr 07, 2010 at 11:31:18AM +0200, Nibble wrote:
> reading the surf code I found the following two lines at
> loadstatuschange():
>
> 423 if(c->download)
> 424 stop(c, NULL);
>
> Why are they there?
I think these 2 lines are here to let the stop function
through the escape key
Hi,
I noticed that in surf sometimes the downloads don't even start, so
reading the surf code I found the following two lines at
loadstatuschange():
423 if(c->download)
424 stop(c, NULL);
Why are they there? They don't make any sense to me, and after removing them
the downloads work fine.
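For context, a rough sketch of the construct being discussed. Only the two quoted
lines come from surf.c; the Client fields, the stop() signature and the surrounding
body are reconstructed and may differ from the real code.

#include <webkit/webkit.h>

typedef struct {
        WebKitWebView *view;
        WebKitDownload *download;   /* assumption: non-NULL while a download is pending */
        /* ... */
} Client;

static void stop(Client *c, const void *arg);   /* surf's stop(), signature simplified */

static void
loadstatuschange(WebKitWebView *view, GParamSpec *pspec, Client *c)
{
        /* ...title/URI/indicator updates... */

        if (c->download)        /* surf.c:423 */
                stop(c, NULL);  /* surf.c:424 */
        /* Added to stop loading the originating page once a download starts,
         * but as reported above it can also keep downloads from ever
         * starting, hence the plan to drop these two lines. */
}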