From: "Kipp, James" <[EMAIL PROTECTED]>
> I am working on a Windows NT box and I don't have the luxury of any
> file-splitting utilities. We have a data file with fixed-length
> records. I was wondering what the most efficient way of splitting the
> file into 5 smaller files would be. Thought (hoping :-) ) someone out
> there may have done something like this.

# untested code !!!
# please add error checking !!!
use strict;

my $filename      = ...;   # path to the data file to split
my $record_length = ...;   # length of one fixed record, in bytes
my $num_parts     = 5;

my $chunk = 1024 * $record_length; 
        # or something else. I just want the $chunk to be a nice number
        # yet be sure the chunk contains complete records
        # I assume the $chunk will be much smaller than the size of the
        # whole file.

my $file_size = -s $filename;
my $chunks_in_part = int($file_size / ($chunk * $num_parts));  # full chunks per output file

open IN, '<', $filename or die "Cannot open $filename: $!";
binmode(IN);

my $buff;
foreach my $part (1 .. $num_parts) {
        open OUT, "> $filename.$part";
        binmode(OUT);
        for(my $i = 1; $i <= $chunks_in_part ; $i++) {
                sysread IN, $buff, $chunk;
                syswrite OUT, $buff;    # write the buffer just read, not $chunk
        }
        if ($part == $num_parts) { # write the rest to the last file
                while (sysread IN, $buff, $chunk) {
                        syswrite OUT, $buff;
                }
        }
        close OUT;
}


I think you get the idea: read the file in chunks (at least N*4KB) that 
contain only whole records, and move the data with sysread() and 
syswrite(). 
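
In case it helps, here is an untested sanity-check sketch for afterwards: 
each part should hold a whole number of records, and the parts together 
should add up to the original size. The file name and record length below 
are hypothetical placeholders; use the same values as in the script above.

# untested sketch -- sanity-check the split afterwards
use strict;

my $filename      = 'data.dat';   # hypothetical placeholder, your real file
my $record_length = 128;          # hypothetical placeholder, your record size
my $num_parts     = 5;

my $total = 0;
foreach my $part (1 .. $num_parts) {
        my $size = -s "$filename.$part";
        die "$filename.$part is missing\n" unless defined $size;
        die "$filename.$part does not hold whole records\n"
                if $size % $record_length;
        $total += $size;
}
die "the parts do not add up to the original file size\n"
        unless $total == -s $filename;
print "OK: every part contains complete records\n";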

Jenda
===== [EMAIL PROTECTED] === http://Jenda.Krynicky.cz =====
When it comes to wine, women and song, wizards are allowed 
to get drunk and croon as much as they like.
        -- Terry Pratchett in Sourcery

