Hi there,
I have been trying to get this (traffic shaping) working for a while. What
packages are you using to achieve CBQ? Which Debian release and which
kernel version are you using?
TIA.
Regards
David Anso
- Original Message -
From: "José Carlos Ramírez Pérez" <[EMA
Hi,
I think the simplest way is to use dd. I made a backup with:
dd if=/dev/sdb bs=2048 count=70 | gzip -v > backup_cd1.img.gz
dd if=/dev/sdb bs=2048 skip=70 | gzip -v > backup_cd2.img.gz
To put it back, mount the CDs and:
gunzip -cd backup_cd1.img.gz | dd of=/dev/sdb bs=2048 count=7000
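For the second image, the restore would presumably mirror the skip= offset
with seek= on the output side. A sketch, assuming the same 70-block split
point as in the backup commands above (seek=70 skips the first 70 output
blocks, so the second image lands where the first one ends):
gunzip -cd backup_cd2.img.gz | dd of=/dev/sdb bs=2048 seek=70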
José Carlos Ramírez Pérez <[EMAIL PROTECTED]> writes:
> On the other hand, I have a Squid proxy running on the same machine and
> can't control the traffic it generates with CBQ. This is because the
> communication is between user and proxy and between proxy and internet, so if I
> choose to limit traff
I'm trying to make backups to CD and, of course, have 800-900 megs worth of
data, compressed. What is the best way to split up large tar or cpio files
so that they can easily be put back together after booting off a rescue
floppy or the like? I don't need any scripts or direct-to-the-burner
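A common alternative to raw dd images is to cut the archive itself into
CD-sized pieces with split and glue them back with cat. A sketch, assuming
a compressed archive named backup.tar.gz and an illustrative 650 MB chunk
size (neither is from the original post):
split -b 650m backup.tar.gz backup.part.
cat backup.part.* > backup.tar.gz
split names the pieces backup.part.aa, backup.part.ab, and so on, so a
plain shell glob reassembles them in order, even from a rescue floppy.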
Thanks,
I have installed kernel 2.4.1
with the latest util-linux, modutils and raidtool (arrrgg)
and finally I can write files >2 GB :-)
...now I must rebuild ls, mv etc.
m.
On Tue, Feb 06, 2001 at 09:33:06AM -0800 or thereabouts, brian moore wrote:
> On Tue, Feb 06, 2001 at 12:06:16A
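Rebuilding ls, mv and friends for large files generally comes down to
recompiling with the glibc large-file-support macros. A sketch, assuming a
GNU fileutils source tree (the flags shown are the standard LFS
conventions, not taken from this thread):
CFLAGS="-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64" ./configure
make
Compiling with _FILE_OFFSET_BITS=64 makes off_t 64-bit, which is what lets
the rebuilt tools handle files larger than 2 GB on 32-bit systems.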
Hello all.
I've set up a traffic shaper (or should I say a bandwidth control policy)
on my Debian router using CBQ, with the invaluable help of cbq.init from
Pavel Golubev (I've slightly modified it to be able to create
non-bounded classes and to specify prioritized filter rules). I've
created seve
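For reference, cbq.init is a wrapper that generates tc commands from the
iproute package. A minimal sketch of the kind of setup it produces (the
device, rates and filter match below are illustrative assumptions, not the
poster's actual configuration):
tc qdisc add dev eth0 root handle 1: cbq bandwidth 10Mbit avpkt 1000
tc class add dev eth0 parent 1: classid 1:10 cbq bandwidth 10Mbit \
    rate 256Kbit allot 1514 prio 5 avpkt 1000 bounded
tc filter add dev eth0 parent 1: protocol ip prio 16 u32 \
    match ip dst 192.168.0.0/24 flowid 1:10
Omitting the bounded keyword gives a class that may borrow spare bandwidth,
which is the "non-bounded" behaviour the poster patched cbq.init to allow.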