"Camilo Gonzalez" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]
> Okay, I'm still struggling here. My problem is that I have a client with a
> rather large tab-delimited text file that I am attempting to slurp up and
> place in a MySQL table. The file is almost 6 MB, but my ISP only allows
> 2 MB of RAM per user. I can slurp up only about 1.5 MB before I get an
> error message. I would like to read in only about 1.5 MB at a time, but
> if I use the following I exceed the memory limit:
>
> while (sysread(TEMP, $temp, 1_500_000)) {
>     # read into MySQL
> }
>
> Is there a way to step through a large file and process only what I read
> in? I'm so stymied I wanna puke.
>

Sure... use the readline function or the <FH> angle-bracket notation:

while ( <TEMP> ) {
  # $_ holds your record
}

This puts only one record (one line, by default) into memory at a time, so
you never come anywhere near the 2 MB limit.
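If you want to see it end to end, here is a minimal sketch of the
read-and-insert loop using the DBI module. The DSN, credentials, table name
"records", the three-column layout, and the file name "data.txt" are all
placeholders for illustration; adjust them to match the real file and table.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connect to MySQL; the DSN, user, and password are placeholders.
my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                       'user', 'password', { RaiseError => 1 });

# Prepare the INSERT once and reuse it for every record.
my $sth = $dbh->prepare(
    'INSERT INTO records (col1, col2, col3) VALUES (?, ?, ?)');

open my $fh, '<', 'data.txt' or die "Cannot open data.txt: $!";

while ( my $line = <$fh> ) {          # one line in memory at a time
    chomp $line;
    my @fields = split /\t/, $line;   # tab-delimited fields
    $sth->execute(@fields);           # one row into MySQL
}

close $fh;
$dbh->disconnect;

(This assumes each line has exactly three fields; the column list in the
INSERT must match what split produces.)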

Todd W.


