I've been using the software at www.pathtoolong.com; it resolves the long
filename & path issue and deletes locked files.
Yes, thank you, that's it!!
The script that was giving the URLs to wget is from an older mailing-list
archive; it gets the URLs out of an HTML file, in my case a wget-ed
sites.google.com HTML file:
for URL in $(perl -ne 'print "$1\n" while (/href=\"(.+?)\"/ig)' site.html |
grep "attredirects"); do
    wget "$URL"
done
On Sun, Apr 26, 2009 at 15:07:19 +0200, Erik Xavior wrote:
> $ wget -tc '?' URL
> wget: --tries: Invalid number `c'.
>
> $ wget --trimcharacter '?' URL
> wget: unrecognized option `--trimcharacter'
> Usage: wget [OPTION]... [URL]...
>
> Try `wget --help' for more options.
>
>
> man wget says nothing about -tc
$ wget -tc '?' URL
wget: --tries: Invalid number `c'.
$ wget --trimcharacter '?' URL
wget: unrecognized option `--trimcharacter'
Usage: wget [OPTION]... [URL]...
Try `wget --help' for more options.
man wget says nothing about -tc
my wget version: 1.11.4-2
"--content-disposition" still gives "
Erik Xavior wrote:
> Hi!
>
> I've got a little script that gives wget some URLs to download. The
> URLs are not so long, but they get redirected, and then they are too
> long:
>
> wget $(script)
>
> Cannot write to
> something?attredirects=0auth=ANoY7cqi24QEtZt9tVRYpcBnR5N5Y6sU0eERgXUdKmJCYKN7thBmfdghjfdsdo6ihFwUpTG1Wmtp4qjzZmwT
Hi!
I've got a little script that gives wget some URLs to download. The URLs
are not so long, but they get redirected, and then they are too long:
wget $(script)
Cannot write to
something?attredirects=0auth=ANoY7cqi24QEtZt9tVRYpcBnR5N5Y6sU0eERgXUdKmJCYKN7thBmfdghjfdsdo6ihFwUpTG1Wmtp4qjzZmwT
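(A possible workaround, not from the thread itself: since the name only
becomes too long after the redirect, -O can pin the output filename before
wget follows it. The file1.html/file2.html naming below is just a
placeholder scheme I picked:)
n=0
for URL in $(script); do
    n=$((n+1))
    # -O fixes the local name, so the redirected URL's long query
    # string never turns into a filename
    wget -O "file$n.html" "$URL"
done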