Maybe you can use at and wget. Wget doesn't download in chunks, but if
the download gets interrupted, you can use the -c option to continue
downloading from where it stopped.
Example:
$ echo wget ftp://ftp.redhat.com/pub/redhat/redhat-6.1/iso/6.1-i386.iso \
    -o logfile | at midnight saturday
Make sure to replace
"ftp://ftp.redhat.com/pub/redhat/redhat-6.1/iso/6.1-i386.iso" with
something else. :-)
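If the connection times out partway through, you can also wrap wget in a
retry loop instead of running it once. A minimal sketch, assuming a
Bourne-style shell; the URL and the 60-second pause are placeholders:

#!/bin/sh
# Keep retrying until wget exits successfully;
# -c resumes from whatever was already downloaded.
until wget -c -o logfile ftp://ftp.example.com/path/to/file.iso
do
    sleep 60    # wait a bit before reconnecting
done

You could hand a script like that to at instead of the bare wget
command. wget also has a -t/--tries option, but an explicit loop makes
the retry behavior easy to see and tweak.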
If the server supports rsync, it might be a better alternative to ftp,
since rsync can also resume partial transfers.
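For example, something like this (a sketch; --partial tells rsync to
keep a partially transferred file so a rerun can pick up where it left
off, and the server and path here are made up):

$ rsync -v --partial rsync://ftp.example.com/pub/file.iso .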
Gaurav Yadav wrote:
> There is a problem I am working on: I want a variation of the
> ftp utility.
>
> What I need is this:
> file x should be transferred/fetched at time y, in z chunks of data.
>
> Is there any software (open source, of course) available that I
> can modify to make it behave this way?
>
> I am currently writing an ftp queue, because my link is very
> slow. I also want to be able to retry the ftp transfer to get the
> rest of the remaining data if the connection times out.
--
Ivan Jager