I would probably stream the keys to the client and count them there (it's the
most efficient method I can think of).

If you have node.js installed, do this:

npm install riak-js@latest
node -e "require('riak-js').getClient({ port: 8098 }).count('bucket');"
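If you'd rather not pull in riak-js, here is a rough sketch of the same idea
straight against Riak's HTTP interface. It assumes the node is listening on
localhost:8098, that your Riak version supports the ?keys=stream (and
props=false) listing options, and that 'bucket' is just a placeholder for
your bucket name:

// count_keys.js - rough sketch, not the riak-js call above.
// Assumes Riak's HTTP interface on 127.0.0.1:8098 and that
// ?keys=stream&props=false is available; 'bucket' is a placeholder.
var http = require('http');

http.get({ host: '127.0.0.1', port: 8098,
           path: '/riak/bucket?keys=stream&props=false' }, function (res) {
  var buffer = '';
  var count = 0;

  res.on('data', function (chunk) {
    buffer += chunk;
    // The stream is a series of JSON documents like {"keys":["k1","k2"]}
    // concatenated together; peel off complete documents as they arrive.
    // (Naive split on '}': assumes the keys themselves contain no braces.)
    var end;
    while ((end = buffer.indexOf('}')) !== -1) {
      var doc = buffer.slice(0, end + 1);
      buffer = buffer.slice(end + 1);
      try {
        var parsed = JSON.parse(doc);
        if (parsed.keys) count += parsed.keys.length;
      } catch (e) {
        // Ignore anything that isn't a complete keys document.
      }
    }
  });

  res.on('end', function () {
    console.log('keys in bucket: ' + count);
  });
});

That prints a single total once the key stream finishes, so only the keys
(not the objects themselves) ever cross the wire.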


2011/11/21 Stephen Bennett <st...@bennettweb.org>

> I have a bucket which contains images referenced by a key made up from a
> GUID. I have a number of servers in my cluster, and the bucket is set up
> to store 3 copies of every item across the servers in the cluster. I'd
> like to understand a little more about how my cluster is performing in
> terms of data storage. I can find out how much space each bitcask is
> currently taking up on each server, but I'd like to cross-reference that
> against the number of unique keys being stored in the system.
>
> I've tried map/reduce, calling the Erlang functions defined in
> riak_kv_mapreduce against the HTTP interface, but my queries are timing
> out. I've tried extending the timeout, but they still time out.
>
> What's the most efficient way to find out how many keys exist in a
> particular bucket?