Even more so when you compare it to a script executing the inserts rather
than the mysql client...
Olaf
On 6/5/08 12:06 PM, "mos" <[EMAIL PROTECTED]> wrote:
> At 10:30 AM 6/5/2008, you wrote:
>> Simon,
>>
>> In my experience load data infile is a lot faster than a sql file through
>> the client.
Olaf, Mike,
Thanks for the input. The blob data is just text, so I'll have a go at
using the load data command.
Regards
Simon
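For anyone following along, a minimal sketch of what that load data step could
look like - the table name, column names, and file path here are hypothetical,
not taken from the thread:

  LOAD DATA INFILE '/tmp/enwiki_text.csv'
  INTO TABLE enwiki_text
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  (id, body);

The CSV would be produced by parsing the INSERT statements out of enwiki.sql,
as Olaf suggests, with one row per line and the columns in the same order as
the column list above.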
Simon,
In my experience load data infile is a lot faster than a sql file through
the client.
I would parse the sql file and create a csv file with just the columns of
your table and then use load data infile using the created csv file
Olaf
On 6/5/08 4:52 AM, "Simon Collins" <[EMAIL PROTECTED]> wrote:
From: Simon Collins <[EMAIL PROTECTED]>
Subject: Re: Large import into MYISAM - performance problems
To: mysql@lists.mysql.com
Date: Thursday, June 5, 2008, 3:05 PM
I'm loading the data through the command below:
mysql -f -u root -p enwiki < enwiki.sql
I can do that - if the load data infile command definitely improves
performance and splitting the file does the same, I have no problem with
doing this. It just seems strange that it's a problem with the way the
import file is configured. I thought the problem would be somehow with
the table getting
You could load the data into several smaller tables and combine them
into a MERGE table, which would have no real effect on the schema.
Ade
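A rough illustration of that merge-table approach (all names here are made up;
the underlying tables must be MyISAM and structurally identical):

  CREATE TABLE enwiki_text_1 (id INT NOT NULL, body MEDIUMTEXT) ENGINE=MyISAM;
  CREATE TABLE enwiki_text_2 (id INT NOT NULL, body MEDIUMTEXT) ENGINE=MyISAM;
  -- load each part separately, then present them as one logical table:
  CREATE TABLE enwiki_text_all (id INT NOT NULL, body MEDIUMTEXT)
    ENGINE=MERGE UNION=(enwiki_text_1, enwiki_text_2) INSERT_METHOD=LAST;

Queries against enwiki_text_all see all the rows, while each part table stays
under whatever per-table size limit is being hit.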
Simon,
Why don't you split the file and use the LOAD DATA INFILE command, which
would improve performance when loading into an empty table with keys
disabled?
regards
anandkl
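As a sketch of that suggestion (table and file names are hypothetical), the
keys can be switched off around the per-chunk loads and rebuilt once at the end:

  ALTER TABLE enwiki_text DISABLE KEYS;   -- skip non-unique index maintenance during the load
  LOAD DATA INFILE '/data/enwiki_chunk_001.csv' INTO TABLE enwiki_text
    FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
  -- ... one LOAD DATA per chunk ...
  ALTER TABLE enwiki_text ENABLE KEYS;    -- rebuild the indexes in a single pass

Note that DISABLE KEYS only affects non-unique indexes, so a primary key would
still be maintained row by row.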
On 6/5/08, Simon Collins <[EMAIL PROTECTED]> wrote:
I'm loading the data through the command below:
mysql -f -u root -p enwiki < enwiki.sql
The version is MySQL 5.0.51a-community
I've disabled the primary key, so there are no indexes. The CPU has 2
cores and 2 Gigs memory.
The import fell over overnight with a "table full" error as it hit 1T.
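One possibility worth checking here - this is an assumption, not something
confirmed in the thread - is that the error comes from the MyISAM per-table
Max_data_length cap rather than from the disk filling up. That cap is derived
from MAX_ROWS and AVG_ROW_LENGTH and can be inspected and raised, roughly like
this (hypothetical table name, illustrative numbers):

  SHOW TABLE STATUS LIKE 'enwiki_text';   -- check the Max_data_length column
  ALTER TABLE enwiki_text MAX_ROWS = 2000000000 AVG_ROW_LENGTH = 8192;

The ALTER rebuilds the table, so on a table this size it is itself a long
operation.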
Hi,
Break up the file into small chunks and then import them one by one.
On Wed, Jun 4, 2008 at 10:12 PM, Simon Collins <
[EMAIL PROTECTED]> wrote:
> Dear all,
>
> I'm presently trying to import the full wikipedia dump for one of our
> research users. Unsurprisingly it's a massive import file (2.7T)
Simon,
As someone else mentioned, how are you loading the data? Can you
post the SQL?
You have an Id field, so is that not the primary key? If so, the
slowdown could be maintaining the index; in that case, add up to 30% of
your available RAM to key_buffer_size in your my.cnf file.
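As a concrete sketch of that tuning suggestion (the 600M figure is just an
illustration of roughly 30% of a 2G box, not a value from the thread):

  -- takes effect immediately; the permanent equivalent in my.cnf is key_buffer_size=600M
  SET GLOBAL key_buffer_size = 600 * 1024 * 1024;
  SHOW VARIABLES LIKE 'key_buffer_size';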
Hi Simon,
How are you doing this import into your table?
On 6/4/08, Simon Collins <[EMAIL PROTECTED]> wrote:
>
> Dear all,
>
> I'm presently trying to import the full wikipedia dump for one of our
> research users. Unsurprisingly it's a massive import file (2.7T)
>
> Most of the data is importing into