This may be another max-open-file-handles type of error, or possibly even an
out-of-memory issue if the key length is large.
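
For what it's worth, here is a quick sketch (Python, purely illustrative) of
two things worth checking on the node: the open-file-descriptor limit, and a
back-of-envelope keydir memory estimate. Bitcask keeps every key in its
in-memory keydir, so the per-key figures below are assumed ballpark numbers,
not measured ones.

    import resource

    # Bitcask holds a file handle per open data file, so a low
    # descriptor limit on the Riak node can surface as write errors.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(f"open-file soft limit: {soft}, hard limit: {hard}")

    # Rough keydir memory estimate for 1 billion records; both
    # per-key constants here are assumptions, adjust to your data.
    num_records = 1_000_000_000
    key_bytes = 20        # assumed average key length
    overhead_bytes = 40   # assumed per-entry keydir overhead (ballpark)
    estimated_gib = num_records * (key_bytes + overhead_bytes) / 2**30
    print(f"estimated keydir size: {estimated_gib:.1f} GiB")

If the descriptor limit turns out to be low, raising it with ulimit -n (or
the platform equivalent) before restarting Riak is worth a try.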


On Jul 30, 2010, at 4:59 PM, Grant Schofield wrote:

> I am not sure whether you hit an already-fixed bug in Bitcask. What version
> of Riak are you currently running?
> 
> Grant Schofield
> Developer Advocate
> Basho Technologies
> 
> On Jul 30, 2010, at 1:28 PM, Ken Matsumoto wrote:
> 
>> Hi all,
>> 
>> I just tried to insert 1 billion data records, but I got a
>> "write_lock" error after 12 million records. What is the reason,
>> and how can I avoid it?
>> I am using the bitcask (default) backend with no parameters changed
>> in the config file. Each record is just 70 bytes of text data.
>> 
>> Regards,
>> 
>> Ken.
>> 
>> -- 
>> Ken Matsumoto
>> VP / Research & Development
>> Nomura Research Institute America, Inc.
>> NRI Pacific
>> 1400 Fashion Island Blvd., Suite 1010
>> San Mateo, CA 94404, U.S.A.
>> 


_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
