Thanks Jens for the comment. Actually I am using the Cassandra Stress Tool, and
it is the tool that inserts such large statements.

But do you mean that inserting columns with large values (say, text of 20-30
KB) is potentially problematic in Cassandra? What should I do if I want
columns with large values?
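As an aside, Jens's suggestion of splitting one big batch into several smaller ones can be sketched roughly like this. This is a minimal sketch, not the stress tool's actual code: the chunking is plain Python, while the DataStax Python driver calls (`BatchStatement`, `session.execute`) are shown only in comments since they need a live cluster, and the row data here is made up for illustration.

```python
# Minimal sketch: split a large set of rows into smaller batches
# instead of sending them all in one oversized BatchStatement.

def chunked(rows, size):
    """Yield successive chunks of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# Illustrative data; in practice these would be the rows the stress
# tool (or application) wants to insert.
rows = [("key-%d" % i, "value-%d" % i) for i in range(1000)]

for chunk in chunked(rows, 100):
    # With the DataStax Python driver, each chunk would become one
    # small batch, e.g. (requires a connected session and a prepared
    # insert statement):
    #   batch = BatchStatement()
    #   for key, value in chunk:
    #       batch.add(prepared_insert, (key, value))
    #   session.execute(batch)
    pass

print(sum(1 for _ in chunked(rows, 100)))  # 10 batches of 100 rows each
```

Each small batch stays well under the warning threshold, which is the behaviour Jens recommends below.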

best,
/Shahab

On Sun, Oct 5, 2014 at 6:03 PM, Jens Rantil <jens.ran...@tink.se> wrote:

> Shabab,
> If you are hitting this limit because you are inserting a lot of (CQL)
> rows in a single batch I suggest you split the statement up in multiple
> smaller batches. Generally, large inserts like this will not perform very
> well.
>
> Cheers,
> Jens
>
> —
> Sent from Mailbox <https://www.dropbox.com/mailbox>
>
>
> On Fri, Oct 3, 2014 at 6:47 PM, shahab <shahab.mok...@gmail.com> wrote:
>
>> Hi,
>>
>> I am getting the following warning in the cassandra log:
>> " BatchStatement.java:258 - Batch of prepared statements for [mydb.mycf]
>> is of size 3272725, exceeding specified threshold of 5120 by 3267605."
>>
>> Apparently it relates to the (default) size limit for prepared insert
>> statements. Is there any way to change the default value?
>>
>> thanks
>> /Shahab
>>
>
>
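For reference, the 5120-byte threshold in the warning quoted above corresponds to a setting in cassandra.yaml (this assumes Cassandra 2.1+, where the batch-size warning was introduced); raising it only silences the warning and does not make large batches perform better:

```yaml
# cassandra.yaml: log a warning for any batch larger than this size.
# The default of 5 (KB) produces the "exceeding specified threshold
# of 5120" message seen above.
batch_size_warn_threshold_in_kb: 5
```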