> redirect to a file, bash it into suitable shape with your Unix text tools of
> course, use said file as input to wget.
>
>
> --
> alan dot mckinnon at gmail dot com
>
>

Here

http://www.gentoo-wiki.info/TIP_Gentoo_for_dialup_users

I found this gem:

    emerge -fpu world | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g' > links.txt

But something doesn't seem right. links.txt has 92 lines (I added the
ND switches), each containing only a single URL, all pointing at
distfiles.gentoo.org, one per package. The file is only 5.5k. But the
raw emerge output lists several mirror URLs for each package and has
gotta be ~200k. And if you read the article, the wget command is meant
to skip the remaining URLs as soon as one copy of the package has been
downloaded:
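For what it's worth, feeding a made-up line with two space-separated
mirror URLs through the same sed expression (the second hostname here
is hypothetical, just for illustration), only the first URL survives
the `s/\ .*$//g` part:

```shell
# Hypothetical emerge -fp output line: two mirror URLs separated by a space.
line='http://distfiles.gentoo.org/distfiles/foo-1.0.tar.bz2 http://mirror.example.org/distfiles/foo-1.0.tar.bz2'

# Same sed expression as in the gem: delete lines that don't start with
# http or ftp, then strip everything from the first space to end of line.
first_url=$(printf '%s\n' "$line" | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g')

echo "$first_url"
```

So the substitution seems to keep only the first URL on each line by
design, which would explain the single-URL lines.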

"With wget, just do:

    wget -i links.txt -nc

Option -i tells wget to look inside links.txt for URLs of stuff to
download, option -nc tells it not to download it twice or thrice once
the file has been retrieved from a working URL."

Am I missing something here?
