On 22 Sep, 02:58, [EMAIL PROTECTED] (Luke) wrote:
> Hello,
> I am looking for the fastest and most reasonable way to insert
> data from a pretty big file (~1,000,000 lines) into a database. I am
> using the Win32::ODBC (ActiveState Perl) module to connect to an
> Access/MSSQL database, inserting line after line. I was wondering if
> there is a better way to do it...
> Maybe create a hash with part of the data (maybe all of it - what are
> the limitations?)
> What other way is there to do it instead of an 'INSERT INTO...'
> statement after reading each line?

DBI has an execute_array method that allows DBD drivers to optimise
such operations.
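
A rough sketch of what that looks like (the DSN, table, column names,
and tab-delimited input format are assumptions you'd adapt; batching
every 10,000 rows keeps memory bounded):

```perl
use strict;
use warnings;
use DBI;

my ($user, $pass) = ('user', 'secret');   # placeholder credentials
my $dbh = DBI->connect('dbi:ODBC:MyDSN', $user, $pass,
                       { RaiseError => 1, AutoCommit => 0 });

my $sth = $dbh->prepare('INSERT INTO mytable (col1, col2) VALUES (?, ?)');

my (@col1, @col2);
open my $fh, '<', 'big_file.txt' or die "open: $!";
while (my $line = <$fh>) {
    chomp $line;
    my ($a, $b) = split /\t/, $line;      # assuming tab-separated input
    push @col1, $a;
    push @col2, $b;
    # Flush in batches of 10,000 rows.
    if (@col1 == 10_000) {
        $sth->execute_array({ ArrayTupleStatus => \my @status },
                            \@col1, \@col2);
        @col1 = @col2 = ();
    }
}
$sth->execute_array({ ArrayTupleStatus => \my @status }, \@col1, \@col2)
    if @col1;
$dbh->commit;
close $fh;
```

Even where the driver just loops internally, you still save on
statement-handle overhead and can commit per batch rather than per row.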

Unfortunately, none of the DBD drivers I've encountered does any
significant optimisation there.

For efficient bulk inserts I usually fall back on writing a file and
using the underlying database's bulk insert tool. This, of course,
does not give portability between databases.
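
For MSSQL, that approach looks roughly like this (the table name, file
path, and delimiters are assumptions; note Access has no BULK INSERT,
so there you'd fall back to batched INSERTs in a transaction):

```perl
use strict;
use warnings;

# Stage the data as a tab-delimited file on disk for the server to load.
open my $in,  '<', 'big_file.txt'    or die "open: $!";
open my $out, '>', 'C:\\temp\\bulk.txt' or die "open: $!";
while (my $line = <$in>) {
    chomp $line;
    my @fields = split /,/, $line;   # assuming comma-separated input
    print {$out} join("\t", @fields), "\n";
}
close $_ for ($in, $out);

# Then have SQL Server load it in one shot, e.g. via Win32::ODBC or osql:
#
#   BULK INSERT mytable
#   FROM 'C:\temp\bulk.txt'
#   WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');
```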


-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
http://learn.perl.org/

