Hi,

I'd like my app to stream a large number of events, all originating from the same 
network input stream, into Cassandra. If I create one batch mutation, can I just 
keep appending events to the batch until I'm done, or are there practical 
considerations to watch for (e.g. too much data buffering up on the client or 
server side, or visibility of data within a batch the client hasn't closed yet)? 
Setting atomicity aside, if I were able to stream a largish source into Cassandra 
this way, what would happen if the client crashed and never closed the batch? Or 
is that just a normal occurrence that Cassandra has to cope with anyway?
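
For concreteness, here's a minimal sketch of the pattern I have in mind, written 
against the DataStax Java driver's BatchStatement as one concrete client API (the 
keyspace, table, and column names are just placeholders I made up, and the loop 
stands in for reading events off the network stream):

    import com.datastax.driver.core.BatchStatement;
    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    public class EventStreamer {
        public static void main(String[] args) {
            Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
            Session session = cluster.connect("my_keyspace"); // hypothetical keyspace

            // Hypothetical events table: (stream_id text, event_id int, payload text)
            PreparedStatement insert = session.prepare(
                "INSERT INTO events (stream_id, event_id, payload) VALUES (?, ?, ?)");

            // One batch that I keep appending to as events arrive off the wire.
            BatchStatement batch = new BatchStatement(BatchStatement.Type.UNLOGGED);
            for (int i = 0; i < 100_000; i++) { // stand-in for the network stream
                batch.add(insert.bind("stream-1", i, "payload-" + i));
            }

            // Nothing reaches the cluster until this executes, which is exactly my
            // question: what happens if the client dies before this point, and how
            // big can the batch safely get before it does?
            session.execute(batch);

            cluster.close();
        }
    }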

Cheers,

Ben
