Hello, I am having trouble writing standard output to disk in chunks. I have a small backup script that uses Net::SSH::Perl to send a tar command against the home directories on a server. The tar command pipes everything to STDOUT, and on my end I write that STDOUT to disk. Everything works fine for small sites. The problem is that the whole output is held in memory first, so when I hit a site that uses more disk space than the "backup" server has memory, STDOUT eats all the memory and the box dies. So I thought I could write chunks of STDOUT to a file on disk at certain intervals, thus avoiding the memory issue.
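Conceptually, this is the shape I am after (an untested sketch; I am assuming, from the Net::SSH::Perl docs, that an SSH-2 "stdout" handler registered with register_handler() gets called for each chunk of command output as it arrives and replaces the in-memory collection that cmd() normally does):

    # Untested sketch: write each chunk of the remote tar's stdout straight
    # to the gzip file as it arrives, so the whole tarball never sits in RAM.
    # $ssh and $home_user are the same variables as in my script below.
    my $gz = gzopen("/export/home/server/scripts/widow/test/$home_user.tar.gz", "w")
        or die "Cannot open gzip file: $gzerrno";

    $ssh->register_handler("stdout", sub {
        my($channel, $buffer) = @_;
        $gz->gzwrite($buffer->bytes);    # flush this chunk to disk right away
    });

    $ssh->cmd("cd /; /usr/bin/nice /bin/tar cpf - /home/$home_user");
    $gz->gzclose();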
<snip> use strict; use diagnostics; $|++; use lib "."; use ServerBackup; use Net::SSH::Perl; use FileHandle; use File::Path; use Net::SSH::Perl::Buffer; use Compress::Zlib; my $user = "netop"; my $hostname = "sonedomain.com"; my $dbg = 1; my $ssh = Net::SSH::Perl->new($hostname, identity_files =>["$id_key_fn"], port => 22, debug => $dbg); $ssh->login($user); my ($home_list, $home_err, $home_exit) = $ssh->cmd($listhome); my @home_users = split " ", $home_list; foreach my $home_user (@home_users) { my $buffer = Net::SSH::Perl::Buffer->new; my $gz = gzopen("/export/home/server/scripts/widow/test/$home_user\.tar.gz", "w"); my ($home_out, $home_err, $home_exit); while (($home_out, $home_err, $home_exit) = $ssh->cmd("cd /; /usr/bin/nice /bin/tar cpf - /home/$home_user")){ my $gz = gzopen("/export/home/server/scripts/widow/test/$home_user\.tar.gz", "w"); $buffer->put_int32($home_out); my $int = $buffer->get_int32; my $tgz_user = $gz->gzwrite($int); } $gz->gzclose(); } </snip> I think I may have bitten off more then I can chew, it just "ain't" working. It creates the file but doesn't write everything in STDOUT to the file. IS any familiar with Net::SSH::Perl::Buffer. Or is there another way I can do this? A perl module I don't know about? THanks for the help. chad -- To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED]