Updating Riak values in batch

2016-06-13 Thread Ricardo Mayerhofer
Hi all, I have a large dataset in Riak (about 20 million keys) storing JSON documents. I'd like to update those documents to remove a JSON attribute. What's the best way to approach this problem in Riak? Thanks. -- Ricardo Mayerhofer

Re: riak-users Digest, Vol 83, Issue 9

2016-06-13 Thread Ryan R Sundberg
/bitcask/50239118783249787813251666124688006726811648/76.bitcask.data","/apps/riak/lib/bitcask/50239118783249787813251666124688006726811648/77.bitcask.data","/apps/riak/lib/bitcask/50239118783249787813251666124688006726811648/78.bitcask.data"],[]} > in

Riak on Solaris/OmniOS/illumos

2016-06-13 Thread Henrik Johansson
Hi, I've recently been told that Riak will no longer be supported on Solaris/illumos-based distributions. At the same time ZFS was recommended, which I find a bit strange since ZFS comes from Solaris/illumos and is still most thoroughly tested on that platform. There are also Riak probes for DTrace. Ca

Re: Massive json schema update

2016-06-13 Thread Damien Krotkine
My advice: use the fetch-update loop that you already know. You won't get better performance or reliability by using a MapReduce. I understand that your values are indexed in Search. So either use a search query and update them all, or do a list_keys in stream mode and update them
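The fetch-update loop Damien describes can be sketched with the official Riak Python client. This is a sketch under assumptions, not a definitive implementation: the bucket name `documents` and attribute name `obsolete_field` are hypothetical, a live node is assumed for the streaming part, and the JSON rewrite is kept as a pure function so it can be checked in isolation.

```python
def strip_attribute(doc, attr):
    """Return a copy of a JSON document (a dict) without the given attribute."""
    return {k: v for k, v in doc.items() if k != attr}

def remove_attribute_from_bucket(client, bucket_name, attr):
    """Fetch-update loop: stream keys, rewrite each document, store it back."""
    bucket = client.bucket(bucket_name)
    # stream_keys() yields batches of keys lazily instead of materializing
    # all ~20M keys at once; listing keys is still expensive, so run off-peak.
    for keys in bucket.stream_keys():
        for key in keys:
            obj = bucket.get(key)
            if isinstance(obj.data, dict) and attr in obj.data:
                obj.data = strip_attribute(obj.data, attr)
                obj.store()

# Usage (requires `pip install riak` and a reachable node; names hypothetical):
#   import riak
#   client = riak.RiakClient(pb_port=8087)
#   remove_attribute_from_bucket(client, "documents", "obsolete_field")
```

Keeping the rewrite in `strip_attribute` also makes it easy to swap the key source: as Damien notes, a Search query over the indexed values would feed the same loop without a full keyspace walk.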