Hi Justin,

Please can you paste in your vm.args as well?
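
For reference, on a packaged install vm.args usually lives at
/etc/riak/vm.args (or rel/riak/etc/vm.args for a source build). To give
an idea of what we're after, a typical one looks roughly like the sketch
below; these values are illustrative defaults, not tuning advice:

    -name riak@192.168.3.3
    -setcookie riak
    +K true
    +A 64
    +P 256000
    -env ERL_MAX_PORTS 4096
    -env ERL_FULLSWEEP_AFTER 0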

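For what it's worth, the trace points at merge_index (the
mi_segment/mi_server modules backing riak_search) rather than LevelDB.
binary_to_term/1 raises badarg whenever the buffer it is handed is not a
complete external-term encoding, which is what the compactor sees if a
segment file on disk is truncated or corrupt, so it crashes on the same
bytes every pass. A minimal Erlang shell sketch of that failure mode
(the tuple is made up; the truncation is the point):

    1> Full = term_to_binary({<<"index">>, <<"field">>, <<"value">>}).
    2> Bad = binary:part(Full, 0, byte_size(Full) - 4). % drop the tail
    3> binary_to_term(Bad).
    ** exception error: bad argument
         in function  binary_to_term/1
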
Thanks,

Richard



On 24 November 2013 10:45, Justin Long <justinl...@outlook.com> wrote:

> Hello everyone,
>
> Our Riak cluster has failed after what seems to be an issue in LevelDB.
> We noticed that a process running a segment compaction has started
> throwing errors non-stop. I opened a Stack Overflow question here,
> where you will find a lot of log data:
> http://stackoverflow.com/questions/20172878/riak-is-throwing-failed-to-compact-like-crazy
>
> Here is exactly what we're getting in console.log:
>
> 2013-11-24 10:38:46.803 [info]
> <0.19760.0>@riak_core_handoff_receiver:process_message:99 Receiving handoff
> data for partition
> riak_search_vnode:1050454301831586472458898473514828420377701515264
> 2013-11-24 10:38:47.239 [info]
> <0.19760.0>@riak_core_handoff_receiver:handle_info:69 Handoff receiver for
> partition 1050454301831586472458898473514828420377701515264 exited after
> processing 5409 objects
> 2013-11-24 10:38:49.743 [error] emulator Error in process <0.19767.0> on
> node 'riak@192.168.3.3' with exit value:
> {badarg,[{erlang,binary_to_term,[<<260
> bytes>>],[]},{mi_segment,iterate_all_bytes,2,[{file,"src/mi_segment.erl"},{line,167}]},{mi_server,'-group_iterator/2-fun-0-',2,[{file,"src/mi_server.erl"},{line,722}]},{mi_server,'-group_iterator/2-fun-1-'...
>
>
> 2013-11-24 10:38:49.743 [error] <0.580.0>@mi_scheduler:worker_loop:141
> Failed to compact <0.11868.0>:
> {badarg,[{erlang,binary_to_term,[<<131,104,3,109,0,0,0,25,99,111,108,108,101,99,116,111,114,45,99,111,108,108,101,99,116,45,116,119,105,116,116,101,114,109,0,0,0,14,100,97,116,97,95,102,111,108,108,111,119,101,114,115,109,0,0,128,203,123,34,105,100,115,34,58,91,49,54,50,51,53,50,50,50,50,51,44,49,55,51,55,51,52,52,50,44,49,50,56,51,52,52,56,55,51,57,44,51,57,56,56,57,56,50,51,52,44,49,52,52,55,51,54,54,57,53,48,44,53,48,48,55,53,57,48,55,44,52,51,56,49,55,53,52,56,53,44,49,51,54,53,49,50,49,52,50,44,52,54,50,52,52,54,56,51,44,49,48,55,57,56,55,49,50,48,48,44,55,55,48,56,51,54,55,57,44,50,56,51,56,51,57,55,56,44,49,57,50,48,55,50,55,51,48,44,51,57,54,57,56,56,57,56,55,44,50,56,48,50,54,51,56,48,52,44,53,57,50,56,56,53,50,51,48,44,49,50,52,55,53,56,57,53,55,56,44,49,55,51,56,56,51,53,52,50,44,49,53,56,57,54,51,50,50,50,48,44,53,53,49,51>>],[]},{mi_segment,iterate_all_bytes,2,[{file,"src/mi_segment.erl"},{line,167}]},{mi_server,'-group_iterator/2-fun-0-',2,[{file,"src/mi_server.erl"},{line,722}]},{mi_server,'-group_iterator/2-fun-1-',2,[{file,"src/mi_server.erl"},{line,725}]},{mi_server,'-group_iterator/2-fun-0-',2,[{file,"src/mi_server.erl"},{line,722}]},{mi_server,'-group_iterator/2-fun-1-',2,[{file,"src/mi_server.erl"},{line,725}]},{mi_server,'-group_iterator/2-fun-0-',2,[{file,"src/mi_server.erl"},{line,722}]},{mi_segment_writer,from_iterator,4,[{file,"src/mi_segment_writer.erl"},{line,110}]}]}
>
> The log is just full of them. We need to get this cluster back up
> ASAP, so any help is greatly appreciated. Thanks!
>
> - Justin
>
_______________________________________________
riak-users mailing list
riak-users@lists.basho.com
http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
