Hi all,
I have a large dataset in Riak (about 20 million keys) storing JSON
documents. I'd like to update those documents to remove a JSON attribute.
What's the best way to approach this problem in Riak?
Thanks.
--
Ricardo Mayerhofer
Hi,
I've recently been told that Riak will no longer be supported on
Solaris/illumos-based distributions. At the same time, ZFS was recommended,
which I find a bit strange, since ZFS comes from Solaris/illumos and that is
still the platform where it is most thoroughly tested. There are also Riak
probes for DTrace.
My advice: use the fetch-update loop that you already know. You won't
get better performance or reliability by using MapReduce.
I understand that your values are indexed in Search. So either run a
search query and update the matches, or do a list_keys in
stream mode and update each key as it arrives.
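A minimal sketch of that fetch-update loop in Python, assuming the official `riak` Python client; the function and bucket names here are hypothetical, and the pure `strip_attribute` helper is separated out so the transformation itself is easy to verify. The import of `riak` is deferred into the driver function so the helper can be used without a running cluster.

```python
import json  # the Python client (de)serializes JSON values for you; shown for clarity


def strip_attribute(doc: dict, attr: str) -> dict:
    """Return a copy of the JSON document with `attr` removed (no-op if absent)."""
    return {k: v for k, v in doc.items() if k != attr}


def remove_attribute_from_bucket(bucket_name, attr,
                                 host="127.0.0.1", pb_port=8087):
    """Hypothetical driver: stream keys and rewrite each object in place.

    Assumes the official `riak` Python client and a reachable node;
    stream_keys() walks the whole keyspace, which is expensive on a
    20M-key production cluster, so prefer a Search query if one applies.
    """
    import riak  # deferred so strip_attribute stays usable without a server
    client = riak.RiakClient(host=host, pb_port=pb_port)
    bucket = client.bucket(bucket_name)
    for keys in bucket.stream_keys():      # keys arrive in batches
        for key in keys:
            obj = bucket.get(key)          # fetch (carries the vector clock)
            if obj.data is not None and attr in obj.data:
                obj.data = strip_attribute(obj.data, attr)
                obj.store()                # update-store completes the loop
```

Keeping the fetch and the store on the same object preserves the vector clock, so concurrent writers still get normal sibling/conflict handling rather than blind overwrites.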