Using the Riak Java client, I am executing a search map reduce like this:

String search = "systemId:" + systemName + " AND indexId:" + indexId;

MapReduceResult result =
        riakClient.mapReduce(SEARCH_BUCKET, search).execute();

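In case it helps, here is a slightly fuller, self-contained sketch of what we're doing. The bucket name, field values, and the client setup via RiakFactory.pbcClient() are placeholders, not our actual configuration:

import com.basho.riak.client.IRiakClient;
import com.basho.riak.client.RiakException;
import com.basho.riak.client.RiakFactory;
import com.basho.riak.client.query.MapReduceResult;

public class SearchMapReduceExample {

    // Placeholder name; the real bucket has Riak Search enabled.
    private static final String SEARCH_BUCKET = "my_search_bucket";

    public static void main(String[] args) throws RiakException {
        // Connect over protocol buffers; an HTTP client is set up the same way.
        IRiakClient riakClient = RiakFactory.pbcClient();
        try {
            // Placeholder values for the two indexed fields we query on.
            String systemName = "system-1";
            String indexId = "index-42";

            String search = "systemId:" + systemName + " AND indexId:" + indexId;

            // Riak Search query used as the input to a map/reduce job
            // (no explicit map or reduce phases are added).
            MapReduceResult result =
                    riakClient.mapReduce(SEARCH_BUCKET, search).execute();

            // Dump the raw JSON result.
            System.out.println(result.getResultRaw());
        } finally {
            riakClient.shutdown();
        }
    }
}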
This worked fine when the bucket contained a few thousand keys. Now that we
have far more data stored in the bucket (at least 250K keys), it's throwing
this generic error:

com.basho.riak.client.RiakException: java.io.IOException:
{"error":"map_reduce_error"}

We've also noticed that storing new key/value pairs in this bucket has slowed
down dramatically.

Any idea what's going on? Are there limitations to Search Map Reduce? Are
there configuration options that need to be changed? Any help would be greatly
appreciated.

-- 
Roger Diller
Flex Rental Solutions, LLC
Email: ro...@flexrentalsolutions.com
Skype: rogerdiller
Time Zone: Eastern Time