Thanks, Pieter, for the quick reply.

I have downloaded the tarball and changed limits.conf as per the
documentation, as below:

* soft nofile 32768
* hard nofile 32768
root soft nofile 32768
root hard nofile 32768
* soft memlock unlimited
* hard memlock unlimited
root soft memlock unlimited
root hard memlock unlimited
* soft as unlimited
* hard as unlimited
root soft as unlimited
root hard as unlimited

root - nproc 32000
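
To verify the new values actually apply (limits.conf is only read through
PAM, so this needs a fresh login for whichever user starts Cassandra; root
in the check below is an assumption):

  su - root -c 'ulimit -n'   # nofile, should now print 32768
  su - root -c 'ulimit -l'   # memlock, should print unlimited
  su - root -c 'ulimit -v'   # address space, should print unlimited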



For some reason, within less than an hour the Cassandra node opens 32768
files, and Cassandra stops responding after that.

It is still not clear why Cassandra is opening that many files and not
closing them properly (does the latest Cassandra 2.0.1 version have some
bugs?).
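
To see where the descriptors are actually going, I can inspect the running
process (a rough sketch; finding the JVM with pgrep on the CassandraDaemon
class name is an assumption about how the process was started):

  PID=$(pgrep -f CassandraDaemon)
  ls /proc/$PID/fd | wc -l        # total descriptors currently open
  lsof -p $PID | awk '{print $NF}' | sort | uniq -c | sort -rn | head
                                  # which files/sockets dominate

If most entries are SSTable data files, compaction may be falling behind and
reads are touching a large number of small sstables; if they are mostly
sockets, it looks more like a connection leak.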

The workload I have been experimenting with is 300 writes per second and 500
reads per second.

And I am using a 2-node cluster with 8-core CPUs and 32 GB RAM (virtual
machines).


Do we need to increase the nofile limit to more than 32768?
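
Also worth noting: the per-process nofile limit cannot exceed the
kernel-wide cap, so if we raise it much higher we may also need to check
fs.file-max (1 000 000 below is just an example value):

  cat /proc/sys/fs/file-max         # current system-wide maximum
  sysctl -w fs.file-max=1000000     # raise it if necessary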

On Thu, Nov 7, 2013 at 4:55 PM, Pieter Callewaert <
pieter.callewa...@be-mobile.be> wrote:

>  Hi Murthy,
>
>
>
> Did you do a package install (.deb?) or you downloaded the tar?
>
> If the latter, you have to adjust the limits.conf file
> (/etc/security/limits.conf) to raise the nofile limit (number of open
> files) for the cassandra user.
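>
> For example, something like this in limits.conf (assuming the process runs
> as a user named "cassandra"; "-" sets both soft and hard):
>
>   cassandra - nofile 100000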
>
>
>
> If you are using the .deb package, the limit is already raised to 100 000
> files (see FD_LIMIT in /etc/init.d/cassandra).
>
> However, with 2.0.x I had to raise it to 1 000 000 because 100 000 was
> too low.
>
>
>
> Kind regards,
>
> Pieter Callewaert
>
>
>
> *From:* Murthy Chelankuri [mailto:kmurt...@gmail.com]
> *Sent:* donderdag 7 november 2013 12:15
> *To:* user@cassandra.apache.org
> *Subject:* Getting into Too many open files issues
>
>
>
> I have been experimenting with the latest Cassandra version for storing
> huge amounts of data in our application.
>
> Writes are doing fine, but when it comes to reads I have observed that
> Cassandra is running into too many open files issues. When I check the
> logs, it is not able to open the Cassandra data files any more because of
> the file descriptor limits.
>
> Can someone suggest what I am doing wrong, and what issues could be
> causing the read operations to run into the Too many open files issue?
>
