Re: handling large data files

2002-10-07 Thread Janek Schleicher
Jerry Preston wrote:
> Hi!,
>
> I am looking for a better and faster way to deal with a 4 - 8 meg data
> file. This file has been saved as a .csv file for Excel to read in.
>
> All I am interested in is the first three cells of ','-delimited data.
>
> Die,Row 0, Column 11
> Test Result,

Re: handling large data files

2002-10-06 Thread Jenda Krynicky
> I am looking for a better and faster way to deal with a 4 - 8 meg
> data file. This file has been saved as a .csv file for Excel to
> read in.

As others pointed out, it's always better to process the file as you read it instead of slurping it whole into memory and processing the array
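The streaming approach Jenda recommends can be sketched as follows. A small in-memory string stands in for the real multi-megabyte file (the data and variable names are illustrative, not from the thread); only the current line is ever held in memory, and `split` with a limit stops scanning each line after the first three fields:

```perl
use strict;
use warnings;

# In-memory stand-in for the real 4-8 meg .csv file.
my $csv = "Die,Row 0, Column 11,extra\nTest Result,1,2,extra\n";
open my $fh, '<', \$csv or die "cannot open: $!\n";

my @rows;
# Process one line at a time: only the current line is in memory.
while (my $line = <$fh>) {
    chomp $line;
    # A limit of 4 makes split stop after the first three fields.
    my @cells = (split /,/, $line, 4)[0 .. 2];
    push @rows, [@cells];
}
close $fh;
print scalar(@rows), " rows processed\n";
```

With a real file, the `open my $fh, '<', \$csv` line would be `open my $fh, '<', $file_path`; everything else is unchanged.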

Re: handling large data files

2002-10-04 Thread James Edward Gray II
On Friday, October 4, 2002, at 09:21 AM, Jerry Preston wrote:

> Hi!,

Howdy.

> I am looking for a better and faster way to deal with a 4 - 8 meg
> data file. This file has been saved as a .csv file for Excel to read in.

A "better" way is pretty open to interpretation, so here's my

Re: handling large data files

2002-10-04 Thread John W. Krahn
Jerry Preston wrote:
>
> Hi!,

Hello,

> I am looking for a better and faster way to deal with a 4 - 8 meg data
> file. This file has been saved as a .csv file for Excel to read in.
>
> [snip]
>
> open( FI, $file_path ) || die "unable to open $file_path $!\n";
> @file_data = <FI>;
> c
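The quoted code slurps the whole file into `@file_data` at once, which is exactly what costs memory on a 4 - 8 meg file. A minimal sketch of that same pattern, modernized with a lexical filehandle and three-argument `open` (an in-memory string stands in for `$file_path` here):

```perl
use strict;
use warnings;

# Stand-in for the real file at $file_path.
my $contents = "Die,Row 0, Column 11\nTest Result,1\n";
open my $fi, '<', \$contents or die "unable to open: $!\n";

# Slurping: every line of the file lands in memory at once.
# Fine for small files, wasteful when only a few cells are needed.
my @file_data = <$fi>;
close $fi;
print scalar(@file_data), " lines slurped\n";
```

Reading the filehandle in list context is what triggers the slurp; a `while (my $line = <$fi>)` loop over the same handle would read one line at a time instead.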

handling large data files

2002-10-04 Thread Jerry Preston
Hi!,

I am looking for a better and faster way to deal with a 4 - 8 meg data
file. This file has been saved as a .csv file for Excel to read in.

All I am interested in is the first three cells of ','-delimited data.

Die,Row 0, Column 11
Test Result,1
Score,1
PMark Score,0
k Score,0
Scor
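Since only the first three cells are wanted, one answer implicit in the replies above is to read no more of the file than necessary. A sketch under the assumption that the three cells sit on the first line (the sample data and names are illustrative):

```perl
use strict;
use warnings;

# In-memory stand-in for the 4-8 meg .csv file.
my $csv = "Die,Row 0, Column 11\nTest Result,1\nScore,1\n";
open my $fh, '<', \$csv or die "cannot open: $!\n";

# Only the first three cells are needed: read one line and stop,
# leaving the remaining megabytes of the file untouched.
my $first_line = <$fh>;
close $fh;
chomp $first_line;
my @cells = (split /,/, $first_line, 4)[0 .. 2];
print join(' | ', @cells), "\n";
```

If the three cells of interest were instead the leading cell of the first three lines, the same early-exit idea applies: loop with `while (<$fh>)` and `last` once three lines have been seen.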