[not sending to the ITP bug, since it is off-topic for that]
On 2025-03-22 15:50, Samuel Henrique wrote:
Given the current status of the maintenance on rsync upstream, it's
going to be handy to have an alternative packaged in the repository just
in case.
Could you elaborate on the sta
On Sat, 22 Mar 2025 at 21:50:00 +, Samuel Henrique wrote:
* Package name: rsync
This is going to need a different name, unless you are aiming for it to
completely replace and supersede the original (samba.org) rsync.
Upstream seems to call their main executable gokr-rsync, which
Package: wnpp
Severity: wishlist
Owner: Samuel Henrique
X-Debbugs-CC: debian-devel@lists.debian.org, debian...@lists.debian.org
* Package name: rsync
Version : 0.2.6-1
Upstream Author : Michael Stapelberg
* URL : https://github.com/gokrazy/rsync
* License : MIT
Programming Lang: Python
Description : Python binding for rsync
This library is a binding for rsync. It wraps the rsync command into a
nice Python interface.
.
It also includes a 'controller' for launching the rsync command.
: GPLv3
Programming Lang: Perl
Description : Multihost parallel rsync wrapper
parsyncfp2 is a tool to efficiently transfer 10s of gigabytes across a network
by running several instances of rsync in parallel. It aggregates files into
chunks (or
On Tue, 2019-11-26 at 23:37 +, Samuel Henrique wrote:
> Hello debian-devel,
>
> TL;DR: What are the drawbacks of providing an rsync udeb (and
> am I right regarding the pros)?
>
> I would like to check in with you before moving on with this feature
> request.
>
On Nov 27, Samuel Henrique wrote:
> TL;DR: What are the drawbacks of providing an rsync udeb (and
Your package will be frozen for a longer time before a release.
The technical part is not hard, you just have to build the package
twice: have a look at kmod for a good example.
--
ciao,
Ma
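The "build the package twice" advice above usually comes down to declaring a second binary stanza marked as a udeb. A minimal debian/control sketch (package names and descriptions illustrative — see kmod's packaging for a real example):

```
Package: rsync
Architecture: any
Description: fast, versatile remote (and local) file-copying tool

Package: rsync-udeb
Package-Type: udeb
Section: debian-installer
Architecture: any
Description: fast, versatile file-copying tool (stripped down for d-i)
```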
Hello debian-devel,
TL;DR: What are the drawbacks of providing an rsync udeb (and
am I right regarding the pros)?
I would like to check in with you before moving on with this feature
request.
rsync: Please provide an rsync udeb
https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=729069
My main
Hi,
Looks like the new rsync 3.1.3-1 was uploaded yesterday. Thank you all for the
work (and we may have a better rsync in Buster)!
--
Regards,
Boyuan Yang
在 2018-12-24一的 09:14 +0100,Paul Slootman写道:
> Hi Samuel,
> (replying above the message as it's all quite relevant but I
d a new release with your changes. I'd like to still
be the "official" maintainer, so I would like to review your stuff first.
Thanks for your work!
Paul
On Sun 23 Dec 2018, Samuel Henrique wrote:
>
> It came to my attention that rsync is a little behind our standards wrt
Hello everyone, and Paul,
It came to my attention that rsync is a little behind our standards wrt
packaging, and it looks like the maintainer doesn't have time to deal with
that at the moment.
Basically what I want to do is to upload the new release (along with some
packaging fixes), addin
Package: wnpp
Owner: Lev Lamberov
Severity: wishlist
* Package name: dired-rsync
Version : 0.2
Upstream Author : Alex Bennée
* URL or Web page : https://github.com/stsquad/dired-rsync
* License : GPL-3+
Programming Lang: Emacs Lisp
Description : support for rsync
Package: wnpp
Severity: wishlist
Owner: Ludovic Drolez
* Package name: backuppc-rsync
Version : 3.0.9.7
Upstream Author : Craig Barratt
* URL : https://github.com/backuppc/rsync-bpc
* License : GPL
Programming Lang: C
Description : rsync optimised for
Package: wnpp
Severity: wishlist
Owner: Sebastien Badia
* Package name: ruby-rsync
Version : 1.0.9
Upstream Author : Joshua Bussdieke
* URL : http://github.com/jbussdieker/ruby-rsync
* License : Expat
Programming Lang: Ruby
Description : ruby wrapper
Package: wnpp
Severity: wishlist
Owner: Thomas Goirand
* Package name: puppet-module-puppetlabs-rsync
Version : 0.2.0
Upstream Author : Garrett Honeycutt
* URL : https://github.com/puppetlabs/puppetlabs-rsync
* License : Apache-2.0
Programming Lang: Ruby
Package: wnpp
Severity: RFP
Source Code and Info: http://darhon.com/syncbackup
License: GNU GPL Version 3
I have used this package in jessie with no bugs. The developer of
the gparted.iso would like to put this program into the gparted.iso
if he can find it in SID.
Please place the pa
Package: wnpp
Severity: wishlist
Owner: Matthew Vernon
Package name: rsbackup
Version : 0.4.3
Upstream Author : Richard Kettlewell
URL : http://www.greenend.org.uk/rjk/2010/rsbackup.html
License : GPL
Programming Lang: C++
Description : rsync
Package: rsync
Severity: minor
Version: 3.0.9-4
On Sun, 12 May 2013, Philip Hands wrote:
> The current state of rsyncd is probably my fault (as initial packager
> of rsync). One _could_ have an rsyncd package, containing just a
> commented out example /etc/rsyncd.conf and the init.d scri
ng Lang: C++
Description : rsync-like audio format transcoder
synconv is a command line based audio format transcoder with an rsync-like
user interface. It's specially useful for synchronizing a music collection
with portable devices that either don't support some audio formats or that
could
Junichi Uekawa
* URL : http://code.google.com/p/lsyncd/
* License : GPLv2+
Programming Lang: C
Description : live syncing mirror daemon that synchronizes local
directories using rsync
Lsyncd uses rsync to synchronize local directories with a remote machine
runn
Package: wnpp
Severity: wishlist
Owner: "Patrick Matthäi"
* Package name: luckybackup
Version : 0.3
Upstream Author : Loukas Avgeriou
* URL : http://luckybackup.sourceforge.net/
* License : GPL-3
Programming Lang: C++
Description : rsync
missed some announcement, ... but it seems rsync on
> > people.debian.org creates directories and files with 700 permission.
> > This is new behavior. If you are synching html pages from your work
> > station, you need to follow rsync with ssh running chmod.
> >
> > Os
Osamu Aoki writes:
> Hi,
>
> When my usual web page updates failed, I was checking my ethernet
> connection ... I wondered why ... Here is the reason:
>
> I might have missed some announcement, ... but it seems rsync on
> people.debian.org creates directories and file
Hi,
When my usual web page updates failed, I was checking my ethernet
connection ... I wondered why ... Here is the reason:
I might have missed some announcement, ... but it seems rsync on
people.debian.org creates directories and files with 700 permission.
This is new behavior. If you are
ang: Haskell
Description : GUI front-end to display rsync status
gtkrsync is a simple GUI that displays a running status display
built from rsync --progress -v. This status display includes a
per-file and overall status bar, overall estimated time to completion,
and an expandable button that sho
deltas... to compute deltas of files that are inside data.tar.gz
This would be fantastic for kernel sources
a.
---
A Mennucc <[EMAIL PROTECTED]> writes:
> Brian Eaton wrote:
>> Hello all -
>>
>> Regarding the ideas discussed here:
>>
>> http://rsync.sam
hi
I had the same idea some time ago
if you ever decide to work on that, I may help
Goswin von Brederlow wrote:
> I actually have a little hack for how one could implement patch debs now to
> test this out:
>
> 1. Create an archive mirror with rsync batch files (or xdelta or
> wh
g checksum.
I actually have a little hack for how one could implement patch debs now to
test this out:
1. Create an archive mirror with rsync batch files (or xdelta or
whatever) between the last and current version of each package. It
might be simplest to replace the data.tar.gz in each deb with the
rsyn
On Mon, 01 May 2006 09:30:55 +0200, Florian Weimer <[EMAIL PROTECTED]>
wrote:
>The downside is that anything that doesn't work on entire .debs is
>very likely to change them at the byte stream level (you only need to
>use slightly different zlib versions or parameters). This means that
>the chain
* Darren Salt ([EMAIL PROTECTED]) [060502 19:03]:
> I demand that Andreas Barth may or may not have written...
> > * Brian Eaton ([EMAIL PROTECTED]) [060501 19:21]:
> >> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> >>> Or you could create the diffdebs before upload or on ftp-master, and
>
I demand that Andreas Barth may or may not have written...
> * Brian Eaton ([EMAIL PROTECTED]) [060501 19:21]:
>> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
>>> Or you could create the diffdebs before upload or on ftp-master, and
>>> include the diffdebs somehow in the Packages file (so t
I demand that Pierre Habouzit may or may not have written...
[snip; delta packages?]
> The real question is: do people clean their apt cache or not? I do, because
> after a full X.org/kde/openoffice upgrade, it takes quite a lot of disk in
> /var (that is small on my computers). And with that cach
Pierre Habouzit <[EMAIL PROTECTED]> writes:
> Le Lun 1 Mai 2006 15:31, Brian Eaton a écrit :
>> On 4/30/06, Goswin von Brederlow wrote:
>> > Look at zsync and help develop it far enough so it can look into
>> > debs. Without that the gain is practically 0 or less.
>>
>> It's entirely possible tha
Peter Samuelson <[EMAIL PROTECTED]> writes:
>> * Goswin von Brederlow:
>> > Look at zsync and help develop it far enough so it can look into
>> > debs. Without that the gain is practically 0 or less.
>
> The other thing to do would be to lobby for dpkg-deb and dpkg-source to
> use 'gzip --rsyncabl
t some other position in the file, not even if that
>> position is also on chunk boundaries.
>>
>> Rsync has a per chunk Adler-32 and md4 checksum. Those chunk checksums
>> are compared to a chunk at every byte position in the file. The
>> Adler-32 checksum is fai
previous
version of the package. Currently my notebook is broken (power
transformer fried with a white flash); when it is alive again, I will
post more details.
a.
Brian Eaton wrote:
> Hello all -
>
> Regarding the ideas discussed here:
>
> http://rsync.samba.org/rsync-and-debian
* Brian Eaton ([EMAIL PROTECTED]) [060501 19:21]:
> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> >Or you could create the diffdebs before upload or on ftp-master, and
> >include the diffdebs somehow in the Packages file (so they're signed as
>well by the usual mechanism).
> My initial
On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
Or you could create the diffdebs before upload or on ftp-master, and
include the diffdebs somehow in the Packages file (so they're signed as
well by the usual mechanism).
My initial view is that any delta package system that doesn't
reproduce
On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
* Brian Eaton ([EMAIL PROTECTED]) [060501 16:42]:
> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> >
> >If one does it right, it might be enough if the original package is
> >*installed*. And that happens quite often, e.g. even for securi
* Brian Eaton ([EMAIL PROTECTED]) [060501 17:49]:
> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> >* Brian Eaton ([EMAIL PROTECTED]) [060501 16:42]:
> >> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> >> >
> >> >If one does it right, it might be enough if the original package is
> >>
* Brian Eaton ([EMAIL PROTECTED]) [060501 16:42]:
> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> >* Brian Eaton ([EMAIL PROTECTED]) [060501 15:51]:
> >> The only time delta packages will be a win is for upgrades where the
> >> client has the original package cached.
> >
> >If one does it r
Le Lun 1 Mai 2006 16:35, Brian Eaton a écrit :
> On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
> > * Brian Eaton ([EMAIL PROTECTED]) [060501 15:51]:
> > > The only time delta packages will be a win is for upgrades where
> > > the client has the original package cached.
> >
> > If one does it
On 5/1/06, Andreas Barth <[EMAIL PROTECTED]> wrote:
* Brian Eaton ([EMAIL PROTECTED]) [060501 15:51]:
> The only time delta packages will be a win is for upgrades where the
> client has the original package cached.
If one does it right, it might be enough if the original package is
*installed*.
Le Lun 1 Mai 2006 15:31, Brian Eaton a écrit :
> On 4/30/06, Goswin von Brederlow wrote:
> > Look at zsync and help develop it far enough so it can look into
> > debs. Without that the gain is practically 0 or less.
>
> It's entirely possible that the gain will be nothing no matter what
> algorithm
* Brian Eaton ([EMAIL PROTECTED]) [060501 15:51]:
> The only time delta packages will be a win is for upgrades where the
> client has the original package cached.
If one does it right, it might be enough if the original package is
*installed*. And that happens quite often, e.g. even for security
On 4/30/06, Goswin von Brederlow <[EMAIL PROTECTED]> wrote:
> Brian Eaton <[EMAIL PROTECTED]> wrote:
>> http://rsync.samba.org/rsync-and-debian/rsync-and-debian.html
>>
>> Has anyone ever done some log file analysis to figure out how much
>> bandwidth
On 5/1/06, Brian Eaton <[EMAIL PROTECTED]> wrote:
Hello all -
Regarding the ideas discussed here:
http://rsync.samba.org/rsync-and-debian/rsync-and-debian.html
A few comments:
3.2 rsync is too hard on servers
That document claims it's an avoidable cost. It's not really, be
> * Goswin von Brederlow:
> > Look at zsync and help develop it far enough so it can look into
> > debs. Without that the gain is practically 0 or less.
The other thing to do would be to lobby for dpkg-deb and dpkg-source to
use 'gzip --rsyncable' when building stuff. (That, or sneak
"--rsyncabl
* Goswin von Brederlow:
> Tyler MacDonald <[EMAIL PROTECTED]> writes:
>
>> Brian Eaton <[EMAIL PROTECTED]> wrote:
>>> http://rsync.samba.org/rsync-and-debian/rsync-and-debian.html
>>>
>>> Has anyone ever done some log file analysis to
sition is also on chunk boundaries.
>
> Rsync has a per chunk Adler-32 and md4 checksum. Those chunk checksums
> are compared to a chunk at every byte position in the file. The
> Adler-32 checksum is fairly weak but it can be updated from one
> position to the next with minimal work. Only wh
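The rolling update described in that snippet can be sketched in a few lines. This is an illustrative Adler-style weak checksum; rsync's own constants and exact formula differ slightly, so treat it as a sketch of the idea rather than a reimplementation:

```python
# Adler-style weak checksum with an O(1) rolling update.
M = 1 << 16

def weak_checksum(block):
    # a: plain byte sum; b: position-weighted sum (first byte weighted
    # most), both reduced mod 2^16.
    a = sum(block) % M
    b = sum((len(block) - i) * byte for i, byte in enumerate(block)) % M
    return a, b

def roll(a, b, out_byte, in_byte, block_len):
    # Slide the window one byte to the right: drop out_byte, append
    # in_byte, without re-summing the whole window.
    a = (a - out_byte + in_byte) % M
    b = (b - block_len * out_byte + a) % M
    return a, b
```

Because the update is O(1) per byte, the checksum can be tested at every offset in the file, and the expensive strong digest (md4 in rsync's case) only needs to be computed when the weak checksum matches.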
Tyler MacDonald <[EMAIL PROTECTED]> writes:
> Brian Eaton <[EMAIL PROTECTED]> wrote:
>> http://rsync.samba.org/rsync-and-debian/rsync-and-debian.html
>>
>> Has anyone ever done some log file analysis to figure out how much
>> bandwidth would be saved by
Brian Eaton <[EMAIL PROTECTED]> wrote:
> http://rsync.samba.org/rsync-and-debian/rsync-and-debian.html
>
> Has anyone ever done some log file analysis to figure out how much
> bandwidth would be saved by transferring package deltas instead of
> entire new packages?
Hello all -
Regarding the ideas discussed here:
http://rsync.samba.org/rsync-and-debian/rsync-and-debian.html
Has anyone ever done some log file analysis to figure out how much
bandwidth would be saved by transferring package deltas instead of
entire new packages?
Assuming someone hasn't
esn't
> provide any cryptographic assurances.
Just as a side note other developers may be interested in knowing, the
debian keyring can be synced via rsync. I personally like having a
mostly-up-to-date copy of it in my computer.
% cat /etc/cron.weekly/LOCAL-update-keyring
> > > IIRC the problem is that rsync is quite CPU-heavy on the servers, so while
> > > the mirrors have the (network) resources to feed downloads to 100s of
> > > users, they don't have the (CPU) resources for a few dozen rsyncs.
> >
> > Why do you
tatement about cpu load of rsync, and did that exactly once,
so I don't "keep saying it". Also, I put in an IIRC, so you have clear
indication that I'm not too sure - somebody asked about the reason, I
answered with what I heard was the reason when the last person asked.
files change. Files should never change. You
> get a completely new file. Unless you can somehow tell it to use the
> old file as base, this is not going to help.
there was a patch by paul russell floating around to build a heuristic
into rsync to allow it to figure out from the name which
On Thu, Nov 04, 2004 at 06:35:40PM +0100, Kurt Roeckx wrote:
> On Thu, Nov 04, 2004 at 05:46:55PM +0100, Otto Wyss wrote:
> >
> > Now if you feel adventurous, repack as many packages on the source mirror
> > with gzip --rsyncable and notice the difference.
>
> Exactly how is this going to help? I c
On Thu, Nov 04, 2004 at 05:46:55PM +0100, Otto Wyss wrote:
>
> Now if you feel adventurous, repack as many packages on the source mirror
> with gzip --rsyncable and notice the difference.
Exactly how is this going to help? I can only see this as being
useful when the files change. Files should nev
> > Can anyone explain why rsync is no longer considered an appropriate
> > method for fetching Packages files?
>
> IIRC the problem is that rsync is quite CPU-heavy on the servers, so while
> the mirrors have the (network) resources to feed downloads to 100s of
> users,
Package: wnpp
Severity: wishlist
* Package name: zsync
Version : 0.0.4
Upstream Author : Colin Phipps <[EMAIL PROTECTED]>
* URL : http://zsync.moria.org.uk/
* License : GPLv2
Description : A client-side implementation of the rsync algorithm
This p
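The client-side idea behind zsync is that the server publishes per-block checksums once, and each client works out locally which blocks it already has — shifting the CPU cost off the mirror. A toy sketch of that planning step (block size and digest choice are illustrative; real zsync also uses a rolling weak checksum so matches need not be block-aligned):

```python
import hashlib

BLOCK = 1024

def block_digests(data):
    # Server side (the published checksum file): one strong digest per
    # fixed-size block of the new file.
    return [hashlib.md5(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def plan_fetch(old_data, new_digests):
    # Client side: reuse any block already present locally, fetch the
    # rest (e.g. via HTTP range requests). Returns ("copy", old_offset)
    # or ("fetch", new_offset) per block of the new file.
    have = {hashlib.md5(old_data[i:i + BLOCK]).hexdigest(): i
            for i in range(0, len(old_data), BLOCK)}
    plan = []
    for idx, digest in enumerate(new_digests):
        plan.append(("copy", have[digest]) if digest in have
                    else ("fetch", idx * BLOCK))
    return plan
```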
On Tue, Oct 26, 2004 at 12:20:19AM -0700, Ian Bruce said
> Now that gzip has the "--rsyncable" option, wouldn't it be feasible to
> rsync against compressed Packages files rather than having to keep the
> uncompressed ones around for this purpose?
You have to explicitly e
On Thu, Oct 28, 2004 at 01:54:54PM +0200, Adrian 'Dagurashibanipal' von Bidder
wrote:
> IIRC the problem is that rsync is quite CPU-heavy on the servers, so
> while the mirrors have the (network) resources to feed downloads to
> 100s of users, they don't have the (
On Oct 26, Ian Bruce <[EMAIL PROTECTED]> wrote:
> Can anyone explain why rsync is no longer considered an appropriate
> method for fetching Packages files? It's the only mechanism I'm aware of
Because it's hard on servers, for a start.
--
ciao, |
Marco
On Tuesday 26 October 2004 09.20, Ian Bruce wrote:
> Can anyone explain why rsync is no longer considered an appropriate
> method for fetching Packages files?
IIRC the problem is that rsync is quite CPU-heavy on the servers, so while
the mirrors have the (network) resources to feed downlo
Can anyone explain why rsync is no longer considered an appropriate
method for fetching Packages files? It's the only mechanism I'm aware of
that makes "apt-get update" over a 56Kb/s connection complete in a
reasonable length of time. Am I missing something?
Begin forwarded
> From time to time the question arises on different forums whether it is
> possible to efficiently use rsync with apt-get. Recently there has been a
> thread here on debian-devel and it was also mentioned in Debian Weekly News
> June 24th, 2003. However, I only saw different small par
Michael Karcher <[EMAIL PROTECTED]> writes:
> On Sun, Jul 06, 2003 at 01:29:06AM +0200, Andrew Suffield wrote:
> > It should put them in the package in the order they came from
> > readdir(), which will depend on the filesystem. This is normally the
> > order in which they were created,
> As long
On Sun, Jul 06, 2003 at 01:29:06AM +0200, Andrew Suffield wrote:
> It should put them in the package in the order they came from
> readdir(), which will depend on the filesystem. This is normally the
> order in which they were created,
As long as the file system uses an inefficient approach for dir
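Since readdir() order is filesystem-dependent, a tool that wants byte-identical archives across rebuilds has to sort explicitly. A minimal sketch of the sorting approach discussed in this thread (function name and layout illustrative):

```python
import os
import tarfile

def make_deterministic_tar(src_dir, out_path):
    # os.listdir()/readdir() order depends on the filesystem (creation
    # order on classic ext2, hash order with htree indexing), so sort
    # explicitly to get the same member order on every rebuild.
    with tarfile.open(out_path, "w") as tar:
        for root, dirs, files in os.walk(src_dir):
            dirs.sort()  # walk subdirectories in a stable order too
            for name in sorted(files):
                path = os.path.join(root, name)
                tar.add(path, arcname=os.path.relpath(path, src_dir))
```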
On Mon, Jul 07, 2003 at 01:01:34AM +0100, Andrew Suffield wrote:
> >
> > I believe htree == dir_index, so tune2fs(8) and mke2fs(8) have the answer.
>
> My /home has that enabled and readdir() returns files in creation order.
>
Then you don't have a htree-capable kernel or the directory isn't
in
On Sun, Jul 06, 2003 at 11:36:34PM +0100, Andrew Suffield wrote:
>
> I can only presume this is new or obscure, since everything I tried
> had the traditional behaviour. Can't see how to turn it on, either.
>
It's new for 2.5. Backports to 2.4 are available here:
http://thunk.org/tytso
On Sun, Jul 06, 2003 at 07:28:09PM -0400, Matt Zimmerman wrote:
> On Sun, Jul 06, 2003 at 11:36:34PM +0100, Andrew Suffield wrote:
>
> > On Sun, Jul 06, 2003 at 05:48:24PM -0400, Theodore Ts'o wrote:
> > > Err, no. If the htree (hash tree) indexing feature is turned on for
> > > ext2 or ext3 file
On Sun, Jul 06, 2003 at 11:36:34PM +0100, Andrew Suffield wrote:
> On Sun, Jul 06, 2003 at 05:48:24PM -0400, Theodore Ts'o wrote:
> > Err, no. If the htree (hash tree) indexing feature is turned on for
> > ext2 or ext3 filesystems, they will returned sorted by the hash of the
> > filename --- eff
, very surprised if reiserfs returned files in creation
> order.
Some trivial testing indicates that it does. Heck if I know how or why.
> It is a really, really bad assumption that files will be
> returned in the same order as they were created.
However, there's no real ne
don't know whether the
filelist is sorted on the fly or the files really appear alphabetically in
the cpio archive.
So I guess we've already seen pros and cons of sorting the files. (One
thing is missing: we still don't know how efficient rsync is if two
rsyncable tar.gz files
On Sun, Jul 06, 2003 at 10:12:03PM +0100, Andrew Suffield wrote:
> On Sun, Jul 06, 2003 at 10:28:07PM +0200, Koblinger Egmont wrote:
> > Yes, when saying "random order" I obviously meant "in the order readdir()
> > returns them". It's random for me. :-)))
> >
> > It can easily be different on diff
Hi,
On 6 Jul 2003, Goswin Brederlow wrote:
> 2. most of the time you have no old file to rsync against. Only
> mirrors will have an old file and they already use rsync.
This is definitely true if you install your system from CD's and then
upgrade it. However, if you keep on upg
On Sun, Jul 06, 2003 at 10:28:07PM +0200, Koblinger Egmont wrote:
>
> On Sun, 6 Jul 2003, Andrew Suffield wrote:
>
> > It should put them in the package in the order they came from
> > readdir(), which will depend on the filesystem. This is normally the
> > order in which they were created, and s
On Sun, 6 Jul 2003, Andrew Suffield wrote:
> It should put them in the package in the order they came from
> readdir(), which will depend on the filesystem. This is normally the
> order in which they were created, and should not vary when
> rebuilding. As such, sorting the list probably doesn't c
On Sun, Jul 06, 2003 at 12:37:00PM +1200, Corrin Lakeland wrote:
> > 4. (and this is the knockout) rsync support for apt-get is NOT
> > WANTED. rsync uses too many resources (cpu and, more relevantly, IO) on
> > the server side and a widespread use of rsync for apt-get would choke
>
On Sun, 2003-07-06 at 09:27, Goswin Brederlow wrote:
> 4. (and this is the knockout) rsync support for apt-get is NOT
> WANTED. rsync uses too many resources (cpu and, more relevantly, IO) on
> the server side and a widespread use of rsync for apt-get would choke
> the rsync mirrors and
On Sunday 06 July 2003 11:27, Goswin Brederlow wrote:
> Koblinger Egmont <[EMAIL PROTECTED]> writes:
> > Hi,
> >
> > >From time to time the question arises on different forums whether it is
> >
> > poss
e to fetch blocks from normal http
and ftp mirrors. This will be used to start fetching data before connections
have been opened with peers.
> Bittorrent calculates a hash for each block of a file very similar to
> what rsync needs to work. Via another small extension rolling
> checksums f
Koblinger Egmont <[EMAIL PROTECTED]> writes:
> Hi,
>
> >From time to time the question arises on different forums whether it is
> possible to efficiently use rsync with apt-get. Recently there has been a
> thread here on debian-devel and it was also mentioned in Debian
ld'' using the sortdir library ([4a], [4b]) which
> makes the files appear in the package in alphabetical order. I don't know
> how efficient rsync is if you split a file to some dozens or even hundreds
of parts and shuffle them, and then synchronize this one with the origi
Hi,
>From time to time the question arises on different forums whether it is
possible to efficiently use rsync with apt-get. Recently there has been a
thread here on debian-devel and it was also mentioned in Debian Weekly News
June 24th, 2003. However, I only saw different small parts of a h
>> But why at the end of http://home.tiscali.cz:8080/~cz210552/aptrsync.html :
>> # Get anything we missed due to failed rsync's. [EMAIL PROTECTED] 24 Mar
>> 2002.
>> os.system('apt-get update')
well, it seems to me this just starts apt-get getting everything all
over again, http_proxy or not.
>> Doing apt-get update just seems to start downloading the Packages.gz
>> even though we just rsynced Packages.
Tim> It could easily be a bug.
Radim> It writes a HIT! message there and skips this file, because it is
Radim> up-to-date by rsync.
Next time I will try with http_p
s. Is apt supposed to detect
>Packages are rather fresh and not download? It just downloaded over
^ I can't quite guess your meaning here.
>again for me.
It could easily be a bug. The rsync servers I was hitting randomly
rejected connections, so I didn't reliabl
ed over
again for me.
And of course commenting out apt-get update means that if some of the
servers in sources.list don't run rsync, then they won't be hit.
sy, and hasn't put it in upstream in the last year, so I
> think we shouldn't wait.
gzip (1.3.5-4) unstable; urgency=low
* merge patch from Rusty Russell that adds --rsyncable option to gzip.
This modifies the output stream to allow rsync to transfer updated .gz
files much more
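What --rsyncable does, conceptually, is reset the deflate stream whenever a rolling sum over a small window hits a chosen value, so an edit only perturbs the compressed bytes up to the next reset point. A toy boundary finder illustrating the idea (window size and mask are illustrative, not gzip's actual constants):

```python
WINDOW = 32   # rolling-sum window length (illustrative)
MASK = 0xFF   # boundary whenever (sum & MASK) == 0 (illustrative)

def chunk_boundaries(data):
    # Return positions where a compressor reset would occur. Boundaries
    # depend only on the bytes inside the current window, so a local
    # edit stops affecting them one window past the change -- the two
    # streams "resynchronize", which is what lets rsync match the tail.
    bounds, s = [], 0
    for i, byte in enumerate(data):
        s += byte
        if i >= WINDOW:
            s -= data[i - WINDOW]
            if (s & MASK) == 0:
                bounds.append(i + 1)
    return bounds
```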
ot are they in separate
> files on the mirrors.
Dunno, if it isn't available then it sounds like a good thing for a feature
request. Though at modem speeds, 12MB only takes one hour.
> Cor> There are technical solutions to precomputing the diffs used by
> Cor> rsync, as well
Re: Package Lists and Size, linux.debian.devel
Cor> Some of the servers run rsync, which works well for the Packages
Cor> file, but does not work for the packages themselves.
OK, will putting rsync in one's sources.list as you say below just
affect the Packages file fetching,
On Mon, May 19, 2003 at 04:25:20PM +0400, Alexander Kotelnikov wrote:
> > On Mon, 19 May 2003 18:28:19 +1000
> > "AP" == Andrew Pollock <[EMAIL PROTECTED]> wrote:
> AP>
> AP>
> AP> This would be made easier if the DSA's were obtainable in a more
> parseable
> AP> format.
>
> DSA's are
> On Mon, 19 May 2003 18:28:19 +1000
> "AP" == Andrew Pollock <[EMAIL PROTECTED]> wrote:
AP>
AP>
AP> This would be made easier if the DSA's were obtainable in a more parseable
AP> format.
DSA's are available in RDF. There is a link at the bottom of
www.debian.org/security to it.
-
On Mon, May 19, 2003 at 06:28:19PM +1000, Andrew Pollock wrote:
> This would be made easier if the DSA's were obtainable in a more parseable
> format. Currently they're being retrieved from
> http://www.debian.org/security/ via a recursive wget.
Perhaps CVS would be easier?
:pserver:[EMAIL PROTE
y're being retrieved from
http://www.debian.org/security/ via a recursive wget. Would it be
possible, and would it be beneficial to anyone else, if they were made
available individually via rsync?
Andrew
[1]
http://lists.debian.org/debian-devel/2003/debian-devel-200305/msg00468.html
based backup system using rsync
A utility to maintain multiple backups on online storage; each backup is
available as a sort of snapshot directory, where common files are shared
between the different backup generations. It uses rsync to do the actual
copying.
.
Backups can be made locally or
On 13 Apr 2002, Brian May <[EMAIL PROTECTED]> wrote:
> On Fri, Apr 12, 2002 at 10:19:27PM +1000, Donovan Baarda wrote:
> > The big problem with rproxy is it's implemented in perl (perl: crypto for
There might be some other unrelated program called rproxy that's in
Perl, but the one I wrote certain
> > http://rsync.samba.org/rsync-and-debian/
> >
> > I'd appreciate comments.
>
This is certainly a very informative page. I'd appreciate if the CPU
load problem could be solved somehow.
IMO the versioning patch from Paul Russell is not the right approach
since