> On Thu, 14 Oct 2004, Dave Kettmann wrote:
>
> > The reply was deserved :) Just another question before I go too far
> > with this... The files I am parsing (just needing 2 tabbed fields out
> > of them) are approximately 20,000 - 25,000 lines long apiece. Each of
> > these files will be globbed into one file, but that is something
> > completely different. I guess my question is, would I be better off
> > calling exec(cut) with files of this size for ease of use? Guess I
> > should have mentioned this in my previous email.
>
> Not necessarily.
>
> I seem to remember that as long as you're iterating over a small window
> of the file at any given time, you don't necessarily end up slurping the
> whole thing into memory at once.
>
> How long is each line? How large are the files, bytewise? And how much
> memory (etc.) do you have to work with?
>
> This is going to be a situation where benchmarks are invaluable.
>
> --
> Chris Devers
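A minimal sketch of the line-at-a-time approach described above, assuming tab-separated records and that the two wanted fields happen to be the 1st and 3rd columns (both assumptions; adjust to the real layout):

```perl
use strict;
use warnings;

# A while loop over the filehandle keeps only the current line in
# memory, so a 300 KB / 25,000-line file never gets slurped whole.
while (my $line = <DATA>) {
    chomp $line;
    my @fields = split /\t/, $line;
    # Keep just the two tab-separated fields of interest
    # (columns 1 and 3 here, purely as an example).
    print join("\t", @fields[0, 2]), "\n";
}

__DATA__
alpha	one	foo	extra
beta	two	bar	extra
```

Swapping `<DATA>` for a handle from `open my $fh, '<', $file` (or plain `<>` for files named on the command line) gives the same one-line-at-a-time behavior.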
Each line is probably 80-100 characters in length, and the files are about 300 KB each (6 files total), working with 1 GB of memory. Looking at these numbers, I don't know that these are really that big of a file, but they seem like it when you look at them in vi ;)... I guess I will give slice a shot and see what I can do with it. I will keep you and the list updated :)

Dave Kettmann
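Since benchmarking was suggested upthread, here is a rough sketch using the core Benchmark module to compare two ways of pulling the fields out of a line; the sub names and field positions are made up for illustration:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

# A sample tab-separated line to exercise both approaches.
my $line = join "\t", map { "field$_" } 1 .. 10;

# Run each candidate for about 1 CPU second and print a comparison
# table (iterations/second and relative speed).
cmpthese(-1, {
    split_all   => sub { my @f = split /\t/, $line; my @want = @f[0, 2]; },
    regex_match => sub { my ($a, $c) = $line =~ /^([^\t]*)\t[^\t]*\t([^\t]*)/; },
});
```

The same harness could also time a child process running cut, which would settle the exec(cut) question with numbers rather than guesses.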