> On Wednesday, August 1, 2012 at 5:47 PM, Jared Morrow wrote:
>
>> You will need to make the adjustments in the /etc/security/limits.conf file
>> as described here http://wiki.basho.com/Open-Files-Limit.html
>>
>> -Jared
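For reference, the limits.conf change on that wiki page amounts to raising the open-file (nofile) limit for the user that runs Riak. A minimal sketch, assuming the node runs as a user named "riak" and that 65536 descriptors is enough -- both the user name and the value are illustrative, not taken from this thread:

    # /etc/security/limits.conf
    # raise the soft and hard per-process file descriptor limits for the Riak user
    riak    soft    nofile    65536
    riak    hard    nofile    65536

On Ubuntu these limits are applied through pam_limits at login, so the new value only shows up in ulimit -n for sessions started after the change, and the Riak node has to be restarted under such a session.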
>>
>>
>> On Aug 1, 201
de the period (.) that
> is above as well.
>
> Reid
>
>
> On Aug 1, 2012, at 5:00 PM, John Roy wrote:
>
>> Hi --
>>
>> Riak 1.1.1
>> three nodes
>> Ubuntu 10.04.1 LTS
>> downtime means one node drops off then the other two follow s
de.
>
> On Wed, Aug 1, 2012 at 1:42 PM, John Roy wrote:
> I'm seeing significant downtime on Riak now. Much like the "Riak Crashing
> Constantly" thread. However in this case we get a "Too many open files"
> error, and also "contains pointer that i
y nodes?
> * Which OS?
> * When you say "downtime" do you mean the entire cluster? Or just a subset of
> your nodes?
>
> Mark
>
> On Wed, Aug 1, 2012 at 1:42 PM, John Roy wrote:
> I'm seeing significant downtime on Riak now. Much like the "Riak Crashing
> Constantly" thread.
ulimit is 4096.
Here's the limit in sysctl:
/usr/sbin# sysctl fs.file-max
fs.file-max = 2413423
On Aug 1, 2012, at 1:49 PM, Vlad Gorodetsky wrote:
> What's your ulimit?
> You should try doing something like "sudo ulimit -n 1024" or go with a
> permanent solution via sysctl, as far as I remember.
>
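One distinction that may help when comparing these numbers: fs.file-max is the system-wide ceiling, while a "Too many open files" (emfile) error normally means the per-process limit that the Riak VM inherited at startup has been hit. A quick way to check both, as a sketch (the pgrep pattern for finding the Erlang VM is an assumption, adjust as needed):

    # per-process limit of the current shell
    ulimit -n

    # limit actually in effect for the running Riak VM (beam.smp)
    grep "open files" /proc/$(pgrep -f beam.smp | head -1)/limits

    # system-wide ceiling, usually far higher than the per-process limit
    sysctl fs.file-max

If the "Max open files" line for the beam process still shows 1024 or 4096, the higher limit never reached the node and it needs to be restarted from an environment where that limit is set.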
I'm seeing significant downtime on Riak now. Much like the "Riak Crashing
Constantly" thread. However in this case we get a "Too many open files" error,
and also "contains pointer that is greater than the total data size." See the
error messages below for more details.
If others have an idea
So here's what I can get for the code from mochiweb_util:parse_header:
%% @spec parse_header(string()) -> {Type, [{K, V}]}
%% @doc Parse a Content-Type like header, return the main Content-Type
%% and a property list of options.
parse_header(String) ->
    %% TODO: This is exactly as broken as Python's cgi module.
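    %% Illustrative call, not from the original post: parse_header/1
    %% returns the lowercased main type plus a property list of its
    %% parameters, e.g.
    %%   parse_header("text/html; charset=UTF-8")
    %%     => {"text/html", [{"charset", "UTF-8"}]}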
[truncated lager {log,sink} / {trace,...} term from the quoted error report]
>
>
> But like I said, it's just a hunch.
>
> Cheers,
> Erik Hoogeveen
>
> On 25 jul 2012, at 20:11, John Roy wrote:
>
Hi --
I'm seeing an issue with timeouts for map/reduces. We're running Erlang files
via a curl command as part of a Haskell job. In the curl data we specify the
timeout to be one hour (3,600,000 milliseconds -- see the example below).
However, the job crashes (times out) after well less than that.
Thanks again,
John
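For anyone comparing notes later: the one-hour value John mentions is the request-level timeout that Riak's HTTP MapReduce endpoint accepts at the top level of the JSON body. A minimal sketch of such a request, assuming the default HTTP interface on port 8098; the bucket, module, and function names are placeholders, not taken from this thread:

    curl -XPOST http://127.0.0.1:8098/mapred \
      -H "Content-Type: application/json" \
      -d '{
            "inputs": "mybucket",
            "query": [
              {"map": {"language": "erlang",
                       "module": "my_mr",
                       "function": "map_values"}}
            ],
            "timeout": 3600000
          }'

Note that this only governs Riak's side of the job; curl itself and anything in between (a proxy or load balancer, for instance) enforce their own connection timeouts, which is one way a job can appear to die well before the timeout in the request body.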
On Jun 5, 2012, at 2:27 PM, Bob Ippolito wrote:
> On Tue, Jun 5, 2012 at 1:22 PM, John Roy wrote:
> I saw something similar from an individual who was using javascript
> (http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-July/004843.html),
> but I
I saw something similar from an individual who was using javascript
(http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-July/004843.html),
but I don't think I have the same problem in erlang. Any help is greatly
appreciated.
I'm consistently getting the following error from my curl command