Hi,

The fastest way to import data is a "LOAD DATA INFILE" statement in
MySQL (in PostgreSQL, use "COPY FROM"); it bypasses a lot of per-row
overhead. Also be sure to drop all the indexes and constraints you can
before the import, since maintaining them takes time as well. Just add
them back afterwards: building an index once over a complete table is a
lot faster than updating it row after row.
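As a rough sketch of that flow (the table name, file path, and column
layout here are made up for illustration; DISABLE/ENABLE KEYS applies to
MyISAM tables):

```sql
-- MySQL: load a TAB-delimited file in one statement.
ALTER TABLE messages DISABLE KEYS;   -- skip per-row index maintenance
LOAD DATA INFILE '/tmp/messages.tsv' INTO TABLE messages;
ALTER TABLE messages ENABLE KEYS;    -- rebuild the indexes in one pass

-- PostgreSQL equivalent (drop indexes/constraints first, recreate after):
COPY messages FROM '/tmp/messages.tsv';
```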
If you're handy with C, look at the code in raw-convert.c. That
particular bit of code was used to create TAB-delimited data files
(usable by both MySQL and PostgreSQL); the data files were then read
with COPY FROM (it was a PostgreSQL system) to import 10 GB of mail in
only 50 minutes.
Important PostgreSQL note: you MUST update the sequences afterwards, as
they are not affected by the COPY FROM command (yes, I did forget that :-)
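Resetting a sequence is one setval() call per sequence; the sequence,
table, and column names below are illustrative:

```sql
-- Point the sequence at the highest imported id so the next
-- INSERT doesn't collide with rows loaded via COPY FROM.
SELECT setval('messages_id_seq', (SELECT max(id) FROM messages));
```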
Regards, Roel
John Heller wrote the following on Wednesday, 18 Dec 2002 at 06:35
(Europe/Amsterdam):
The plan is to use mysql.
Eelco van Beek - IC&S wrote:
Hi,
We once wrote a program to insert 10 GB of mail into dbmail. The
program did this in about 50 minutes.
What database are you going to use?
Best regards,
Eelco
_______________________________________________
Dbmail mailing list
Dbmail@dbmail.org
https://mailman.fastxs.nl/mailman/listinfo/dbmail