I haven't dealt with MS databases in several years, but some of this
advice should still apply. First, a couple of notes on the underlying
databases: if you're running Access, be sure to compact the database
after the insert, particularly if this is going to be a repeated
process.
First of all, you should try to use the database's native tools for
loading tables this size...
If it has to be Perl, which does happen sometimes, you should definitely
switch off autocommit. Try doing a commit every 5 lines or even every
10 lines (ask your friendly DBA how large a transaction can safely get).
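The batched-commit advice above can be sketched with DBI and DBD::ODBC
(a common alternative to raw Win32::ODBC for this kind of job). The DSN,
table, and column names below are made up, and the input is assumed to
be tab-separated; adjust for your data.

```perl
use strict;
use warnings;
use DBI;

my $batch_size = 10;   # commit interval from the advice above; tune with your DBA

# AutoCommit => 0 turns off per-statement commits; RaiseError => 1 makes
# failures die instead of being silently ignored.
my $dbh = DBI->connect('dbi:ODBC:MyDSN', 'user', 'pass',
                       { AutoCommit => 0, RaiseError => 1 });

# Prepare once, execute per row: avoids re-parsing the INSERT a million times.
my $sth = $dbh->prepare('INSERT INTO big_table (col1, col2) VALUES (?, ?)');

open my $fh, '<', 'data.txt' or die "data.txt: $!";
my $rows = 0;
while (my $line = <$fh>) {
    chomp $line;
    my @fields = split /\t/, $line, 2;
    $sth->execute(@fields);
    $dbh->commit if ++$rows % $batch_size == 0;
}
$dbh->commit;        # flush the final partial batch
$dbh->disconnect;
```

This is only a sketch: it needs a working ODBC DSN and an existing
table, and for Access specifically you would still want to compact the
database afterwards as noted earlier in the thread.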
From: Luke <[EMAIL PROTECTED]>
> Hello,
> I am looking for a proper, fastest and most reasonable way to insert
> data from pretty big file (~1,000,000 lines) to database. I am using
> Win32::ODBC (ActiveState Perl) module to connect with Access/MSSQL
> database and inserting line after line. I was
Luke wrote:
> I am looking for a proper, fastest and most reasonable way to insert
> data from pretty big file (~1,000,000 lines) to database.
Format the file the way your database system's data import tool
expects.
A CSV format is often supported.
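CSV quoting is easy to get subtly wrong when writing the export by hand
(Text::CSV is the robust choice if it's available). A minimal helper
following the usual RFC 4180 quoting rules might look like this; the
field values are just examples:

```perl
use strict;
use warnings;

# Quote a field only when needed: if it contains a comma, a quote, or a
# newline, double any embedded quotes and wrap the field in quotes.
sub csv_field {
    my ($f) = @_;
    if ($f =~ /[",\r\n]/) {
        $f =~ s/"/""/g;
        return qq("$f");
    }
    return $f;
}

# Join escaped fields into one CSV line.
sub csv_row { join(',', map { csv_field($_) } @_) . "\n" }

print csv_row('plain', 'has,comma', 'has "quotes"');
# -> plain,"has,comma","has ""quotes"""
```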
See also bcp:
http://technet.mic
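Once the data is in a flat file, a bcp invocation for SQL Server looks
roughly like this (server, database, table, and credentials below are
all hypothetical; bcp ships with the SQL Server client tools):

```shell
# Bulk-load data.csv into the big_table table:
#   in    = load from file into the table
#   -c    = character (text) mode
#   -t ,  = use comma as the field terminator
bcp MyDb.dbo.big_table in data.csv -c -t , -S myserver -U loader -P secret
```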