From: Luke <[EMAIL PROTECTED]>
> Hello,
> I am looking for the fastest and most reasonable way to insert
> data from a pretty big file (~1,000,000 lines) into a database. I am
> using the Win32::ODBC (ActiveState Perl) module to connect to an
> Access/MSSQL database and I am inserting line after line. I was
> wondering if there is a better way to do it...
> Maybe create a hash with part of the data (maybe all of it - what are
> the limitations?)
> Is there another way to do it besides issuing an 'INSERT INTO...'
> statement after reading each line?

If you stick to Perl, you should use DBI and DBD::ODBC, not 
Win32::ODBC. You should ->prepare() the statement once and 
->execute() it for each row. And you should turn AutoCommit off 
and commit only every 100 or 1000 rows.
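
Something along these lines (an untested sketch; the DSN, table and
column names, and the tab-separated file format are just placeholders,
adjust them to your data):

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DBI;

  # Connect through DBD::ODBC with AutoCommit off so we control the
  # transaction boundaries ourselves.
  my $dbh = DBI->connect('dbi:ODBC:MyDSN', 'user', 'password',
      { RaiseError => 1, AutoCommit => 0 })
      or die "Cannot connect: $DBI::errstr";

  # Prepare the INSERT once, with placeholders; execute() is then cheap.
  my $sth = $dbh->prepare(
      'INSERT INTO mytable (col1, col2, col3) VALUES (?, ?, ?)');

  open my $fh, '<', 'bigfile.txt' or die "Cannot open bigfile.txt: $!";

  my $count = 0;
  while (my $line = <$fh>) {
      chomp $line;
      my @fields = split /\t/, $line;        # assuming tab-separated data
      $sth->execute(@fields);
      $dbh->commit unless ++$count % 1000;   # commit every 1000 rows
  }
  $dbh->commit;                              # commit the final partial batch

  close $fh;
  $dbh->disconnect;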

Jenda
===== [EMAIL PROTECTED] === http://Jenda.Krynicky.cz =====
When it comes to wine, women and song, wizards are allowed 
to get drunk and croon as much as they like.
        -- Terry Pratchett in Sourcery

