Re: Riak significant downtime

2012-08-01 Thread John Roy
> On Wednesday, August 1, 2012 at 5:47 PM, Jared Morrow wrote:
>
>> You will need to make the adjustments in the /etc/security/limits.conf file
>> as described here http://wiki.basho.com/Open-Files-Limit.html
>>
>> -Jared
>>
>> On Aug 1, 201

Re: Riak significant downtime

2012-08-01 Thread John Roy
de the period (.) that
> is above as well.
>
> Reid
>
> On Aug 1, 2012, at 5:00 PM, John Roy wrote:
>
>> Hi --
>>
>> Riak 1.1.1
>> three nodes
>> Ubuntu 10.04.1 LTS
>> downtime means one node drops off then the other two follow s

Re: Riak significant downtime

2012-08-01 Thread John Roy
de.
>
> On Wed, Aug 1, 2012 at 1:42 PM, John Roy wrote:
> I'm seeing significant downtime on Riak now. Much like the "Riak Crashing
> Constantly" thread. However in this case we get a "Too many open files"
> error, and also "contains pointer that i

Re: Riak significant downtime

2012-08-01 Thread John Roy
y nodes?
> * Which OS?
> * When you say "downtime" do you mean the entire cluster? Or just a subset of
>   your nodes?
>
> Mark
>
> On Wed, Aug 1, 2012 at 1:42 PM, John Roy wrote:
> I'm seeing significant downtime on Riak now. Much like the "Riak Crash

Re: Riak significant downtime

2012-08-01 Thread John Roy
ulimit is 4096. Here's the limit in sysctl:

usr/sbin# sysctl fs.file-max
fs.file-max = 2413423

On Aug 1, 2012, at 1:49 PM, Vlad Gorodetsky wrote:
> What's your ulimit?
> You should try doing something like "sudo ulimit -n 1024" or go with
> permanent solution via sysctl, as far as I remember.
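For reference, a typical way to inspect these limits on Ubuntu and raise them for the Riak user via /etc/security/limits.conf (the user name and values below are illustrative, not from the thread):

```shell
# Per-process open-file limits for the current shell
ulimit -Sn   # soft limit (e.g. 4096)
ulimit -Hn   # hard limit

# System-wide maximum (what `sysctl fs.file-max` reports)
cat /proc/sys/fs/file-max

# To raise the limit persistently for the user running Riak, add lines
# like these (illustrative values) to /etc/security/limits.conf:
#   riak  soft  nofile  65536
#   riak  hard  nofile  65536
# then log out and back in (or restart the Riak service) so the new
# limits take effect.
```

Note that `ulimit` is a shell builtin, so it must be run in (or inherited by) the shell that starts Riak; changing it in an unrelated terminal has no effect on the running node.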

Riak significant downtime

2012-08-01 Thread John Roy
I'm seeing significant downtime on Riak now. Much like the "Riak Crashing Constantly" thread. However in this case we get a "Too many open files" error, and also "contains pointer that is greater than the total data size." See the error messages below for more details. If others have an idea

Re: MR Error

2012-07-27 Thread John Roy
So here's what I can get for the code from mochi_web:parse_header:

%% @spec parse_header(string()) -> {Type, [{K, V}]}
%% @doc Parse a Content-Type like header, return the main Content-Type
%% and a property list of options.
parse_header(String) ->
    %% TODO: This is exactly as broken as

Re: Riak 1.1.1 and eleveldb map/reduce timeouts

2012-07-27 Thread John Roy
03965>,sink,undefined}},{log,sink},{trace,{set,1,16,16,8,80,48,{[],[],[],[],[],[],[],[],[],[],[],[],[],[],[],[]},{{[],[],[error],[],[],[],[],[],[],[],[],[],[],[],[],[]],64}}]}]

> But like I said, it's just a hunch.
>
> Cheers,
> Erik Hoogeveen
>
> On 25 Jul 2012, at 20:11, John Roy wrote:

Riak 1.1.1 and eleveldb map/reduce timeouts

2012-07-25 Thread John Roy
Hi -- I'm seeing an issue with timeouts for map/reduces. We're running Erlang files via a curl command, as part of a Haskell job. In the curl data we specify the timeout to be one hour (3,600,000 milliseconds -- see the example below). However, the job crashes (times out) after well less tha
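For context, a Riak HTTP map/reduce request carries its timeout as a top-level field (in milliseconds) of the JSON body POSTed to /mapred. A sketch of such a request follows; the host, bucket, and Erlang module/function names are placeholders, not the thread's actual job:

```shell
# Build a map/reduce request body with a one-hour timeout (3,600,000 ms).
# "mybucket" and the my_mr:map_values/3 function are illustrative names.
PAYLOAD='{"inputs":"mybucket","query":[{"map":{"language":"erlang","module":"my_mr","function":"map_values"}}],"timeout":3600000}'

# Submit it to a Riak node (host/port illustrative):
#   curl -X POST http://127.0.0.1:8098/mapred \
#        -H "Content-Type: application/json" -d "$PAYLOAD"
echo "$PAYLOAD"
```

Note the timeout belongs to the whole JSON document, not to an individual query phase; a timeout placed inside a phase object is silently ignored.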

Re: mochijson2 error from curl mapreduce

2012-06-05 Thread John Roy
. Thanks again,
John

On Jun 5, 2012, at 2:27 PM, Bob Ippolito wrote:
> On Tue, Jun 5, 2012 at 1:22 PM, John Roy wrote:
> I saw something similar from an individual who was using JavaScript
> (http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-July/004843.html),
> but I

mochijson2 error from curl mapreduce

2012-06-05 Thread John Roy
I saw something similar from an individual who was using JavaScript (http://lists.basho.com/pipermail/riak-users_lists.basho.com/2011-July/004843.html), but I don't think I have the same problem in Erlang. Any help is greatly appreciated. I'm consistently getting the following error from my cu