Don't over-engineer it: you could use Riak with a simple 2i index (an integer index of YYYYMMDD for the message date, so you can search day by day backward), and for the message sequence or identifier you could use either any SQL database sequence or a UUID generator.
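
For illustration, a minimal sketch of that scheme with the Basho Python client; the bucket and index names are made up, and the exact calls vary by client version:

    # Minimal sketch (pip install riak); bucket name 'room42' and index
    # name 'date_int' are illustrative only.
    import uuid
    import riak

    client = riak.RiakClient()            # localhost by default
    bucket = client.bucket('room42')

    # A UUID key avoids collisions when two messages share a timestamp.
    key = str(uuid.uuid4())
    msg = bucket.new(key, data={'from': 'alice', 'text': 'hello'})
    msg.add_index('date_int', 20121022)   # integer 2i index: YYYYMMDD
    msg.store()

    # Pagination: fetch one day's keys and step the date backward as the
    # user scrolls; a range also works, e.g.
    # bucket.get_index('date_int', 20121016, 20121022) for the last week.
    keys = bucket.get_index('date_int', 20121022)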

HTH,

Guido.

On 22/10/12 10:04, Rapsey wrote:

On Mon, Oct 22, 2012 at 10:29 AM, Shashwat Srivastava <dark...@gmail.com> wrote:


    Now, each bucket would hold the conversation between two users, or of
    a room on a site. The conversation rate for some rooms is very high,
    around 20,000 - 30,000 messages per hour. We have observed that users
    usually don't access conversations more than one week old. So, even
    if a bucket holds three years of conversation, users would mostly
    access the most recent week or month of it. Can Riak handle this
    easily? Also, would Riak use RAM wisely in this scenario? Would it
    keep in RAM only the keys and indexes corresponding to recent
    messages per bucket?


The LevelDB backend should.
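
Note that 2i (as Guido suggests above) also requires the LevelDB backend. On a 1.x-era node that is a one-line setting in app.config; this excerpt is from memory, so check your release's defaults:

    %% app.config: use eLevelDB instead of the default Bitcask backend
    {riak_kv, [
        {storage_backend, riak_kv_eleveldb_backend}
    ]}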

    Finally, what is the best approach for creating keys in a bucket?
    Earlier, I was planning to use a timestamp (in milliseconds), but in
    a room there can be multiple messages at the same instant. As I
    understand it, I cannot have a unique incremental message ID per
    bucket (since Riak accepts writes on every node in a cluster,
    consistency is not guaranteed). Please correct me if I am wrong.
    Another option would be to let Riak generate the key and use the
    timestamp as a secondary index, but that seems like a bad design.
    Also, what would be the best way to achieve pagination for this
    use case?


You could use Redis for incremental IDs.
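
For example, with redis-py (the key name 'msg_id:room42' is made up); INCR is atomic, so concurrent writers each receive a distinct, increasing number:

    # Minimal sketch (pip install redis); key name is illustrative only.
    import redis

    r = redis.Redis(host='localhost', port=6379)

    # INCR is atomic: every writer gets a unique, monotonically
    # increasing ID, even under concurrency.
    next_id = r.incr('msg_id:room42')

The trade-off is an extra moving part: the Redis instance becomes the single source of IDs.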


Sergej


_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
