Hi Jonathan!
Jonathan Oxer wrote on Saturday, 22 November 2003:
> Yes, you're right, apt-cacher currently doesn't transparently handle
> timeouts from the mirror.
Not only that, it also does not handle wget's error codes properly:
wget exits after a refused connection, but the apt-cacher child keeps
waiting for data from the (now dead) process.
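A quick fix for the hanging child would be to actually look at wget's
exit status. Just a rough sketch, not the current apt-cacher code; the
variable names and the error helper are made up:

    # run wget on a pipe and slurp its output
    open(my $wget, '-|', 'wget', '-q', '-O', '-', $url)
        or die "cannot start wget: $!";
    my $data = do { local $/; <$wget> };
    close($wget);

    # $? holds wget's exit status; non-zero means e.g. "connection
    # refused", so tell the client instead of waiting forever
    if ($? != 0) {
        return_error_to_client($? >> 8);   # made-up helper
    }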
Further, it does not save the broken new files under their real names,
which confuses the apt clients. And the next day, when the mirror is up
again, apt-cacher does _not_ refetch them, serving broken Packages
files to the clients.
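To stop half-fetched files from poisoning the cache, the download
could go to a temporary name and only be moved into place on success.
Again only a sketch, not a patch; $cache_file and $url are assumed
names:

    my $tmp = "$cache_file.partial";              # temporary name
    my $rc  = system('wget', '-q', '-O', $tmp, $url);
    if ($rc == 0) {
        # complete download: move it into place atomically
        rename($tmp, $cache_file) or die "rename: $!";
    } else {
        # failed download: throw the fragment away so the next
        # request triggers a fresh fetch instead of serving junk
        unlink($tmp);
    }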
Jonathan, I think the following things also need to be implemented RSN.
Please tell us if you need help doing that, or why these changes would
be a bad idea.
IMO apt-cacher should have two modes:
- online mode: the index files (Packages, Sources, ...) are fetched
  on demand, every time someone tries to access them, using the
  timestamping feature of wget (basically forwarding the same thing
  that apt-get normally does: it requests the file over HTTP with
  If-Modified-Since: <old-date>, so only a few lines of header data
  are exchanged if the file has not been altered; see the sketch
  below). This would be a good method for people with a persistent
  internet connection.
- offline mode: no automatic refresh of the index files happens by
  default. However, it can be triggered by an external call of
  apt-cacher.pl (from a cron job, for example, every 5 hours or so;
  see the crontab sketch below). This is similar to what apt-cacher
  currently does (though driven by an expiration timer) -- an
  acceptable method for people who have to pay for each internet
  connection, at the price of possibly being out of sync with the
  archive.
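For the online mode, the refresh boils down to a conditional GET.
Roughly like this (using LWP here instead of wget, just to show the
idea; $url and $cached_index are made-up names):

    use LWP::UserAgent;

    # mirror() sends If-Modified-Since based on the cached file's
    # mtime, so only a few header lines go over the wire when the
    # index is unchanged on the mirror.
    my $ua  = LWP::UserAgent->new(timeout => 30);
    my $res = $ua->mirror($url, $cached_index);
    if ($res->code == 304) {
        # not modified: serve the existing cached copy
    } elsif ($res->is_success) {
        # a fresh copy has been written to $cached_index
    } else {
        # mirror unreachable: keep serving the cached copy and log it
        warn "refresh of $url failed: " . $res->status_line;
    }

For the offline mode, the external trigger could simply be a cron
entry. The --refresh-indexes switch does not exist yet, apt-cacher
would have to grow something like it:

    # /etc/cron.d/apt-cacher (sketch; path, user and switch made up)
    0 */5 * * * www-data /path/to/apt-cacher.pl --refresh-indexes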
Regards,
Eduard.
--
<housetier> isga: try 'apt-cache search girl friend' some time :-)
you will be surprised