> It took 219 minutes to insert 12+ million docs which translates to about 913
> docs/second using batch_insert in batches of 1250 documents per batch.

How big are the documents and/or how big is the resulting data when loaded?

What is your data model? Is each document a single column, or a row
containing multiple columns? "913 docs/second" can be low, high, or
expected, depending very much on what a document means in terms of rows,
columns and sizes.
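
For illustration, here is a tiny sketch (plain Python dicts, with made-up
keys and column names, not from your data) of the two layouts that question
is getting at, i.e. the shape of the rows a batch insert would carry:

# One row per document, the whole document serialized into one column.
single_column_layout = {
    "doc-00000001": {"body": '{"title": "...", "text": "..."}'},
}

# One row per document, each field stored as its own column.
multi_column_layout = {
    "doc-00000001": {
        "title": "...",
        "author": "...",
        "text": "...",
    },
}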

Did you observe what the bottleneck was during insertion? Were you
inserting with a single client or with multiple concurrent clients, to make
sure the client side isn't the bottleneck?
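
For reference, one way to drive the load from several concurrent clients.
This is a rough sketch in Python using pycassa's ConnectionPool /
ColumnFamily / batch_insert interface rather than phpcassa; the keyspace,
column family, document generator and worker count are all made up for
illustration:

import multiprocessing

import pycassa

BATCH_SIZE = 1250        # same batch size as in the report above
NUM_WORKERS = 4          # number of concurrent client processes
DOCS_PER_WORKER = 250000


def make_docs(worker_id, count):
    # Generate dummy documents as (row key, {column: value}) pairs;
    # ~1 KB payload each, purely illustrative.
    for i in range(count):
        yield "doc-%d-%08d" % (worker_id, i), {"body": "x" * 1024}


def worker(worker_id):
    pool = pycassa.ConnectionPool("MyKeyspace", ["localhost:9160"])
    cf = pycassa.ColumnFamily(pool, "Documents")
    batch = {}
    for key, columns in make_docs(worker_id, DOCS_PER_WORKER):
        batch[key] = columns
        if len(batch) >= BATCH_SIZE:
            cf.batch_insert(batch)   # one mutation call per 1250 rows
            batch = {}
    if batch:
        cf.batch_insert(batch)
    pool.dispose()


if __name__ == "__main__":
    procs = [multiprocessing.Process(target=worker, args=(w,))
             for w in range(NUM_WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

If aggregate throughput scales roughly with the number of workers, the
single client was the limit; if it doesn't, the bottleneck is on the
cluster side.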

(I have no idea how fast phpcassa is.)

-- 
/ Peter Schuller
