On Sat, Aug 1, 2015 at 12:00 PM, <riak-users-requ...@lists.basho.com> wrote:

> Send riak-users mailing list submissions to
>         riak-users@lists.basho.com
>
> To subscribe or unsubscribe via the World Wide Web, visit
>         http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
> or, via email, send a message with subject or body 'help' to
>         riak-users-requ...@lists.basho.com
>
> You can reach the person managing the list at
>         riak-users-ow...@lists.basho.com
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of riak-users digest..."
>
>
> Today's Topics:
>
>    1. Riak CS Node Shutdown Because it Ran Out of Data Space -
>       Unbalanced With Other Nodes in Cluster (Valenti, Anthony)
>    2. Riak Recap - July 31, 2015 (Matthew Brender)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Fri, 31 Jul 2015 17:51:26 +0000
> From: "Valenti, Anthony" <anthony.vale...@inmar.com>
> To: "riak-users@lists.basho.com" <riak-users@lists.basho.com>
> Subject: Riak CS Node Shutdown Because it Ran Out of Data Space -
>         Unbalanced With Other Nodes in Cluster
> Message-ID:
>         <d4bd4f58047b47459f5e4a5039269adcac6c1...@corp-mbx2.inmar.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> We have a Riak CS node that ran out of space on the data mount, and the
> service shut down.  This is strange, since the other nodes in the cluster
> seem to be about 76%-78% full.  Why is it so out of balance, and how can
> it be fixed?
>
> Node - Riak07 (failed)
> Filesystem                1K-blocks      Used Available Use% Mounted on
> /dev/mapper/vgroot-lvroot  21103116   3553252  16480772  18% /
> udev                       16464744         4  16464740   1% /dev
> tmpfs                       3294768       284   3294484   1% /run
> none                           5120         0      5120   0% /run/lock
> none                       16473828         0  16473828   0% /run/shm
> /dev/mapper/vgroot-lvdata 642647464 642580056     67408 100% /data
> /dev/sda1                    472036     60968    386697  14% /boot
> /dev/mapper/vgroot-lvhome   7688360    266548   7031260   4% /home
>
> All other nodes
> Node - Riak01
> Filesystem                1K-blocks      Used Available Use% Mounted on
> /dev/mapper/vgroot-lvroot  21196680   2156104  17971484  11% /
> udev                       16466476         4  16466472   1% /dev
> tmpfs                       6590228       288   6589940   1% /run
> none                           5120         0      5120   0% /run/lock
> none                       16475564         0  16475564   0% /run/shm
> /dev/mapper/vgroot-lvdata 628271104 486828480 141442624  78% /data
> /dev/sda1                    484068     49921    409776  11% /boot
> /dev/mapper/vgroot-lvhome   7785688    446688   6948448   7% /home
>
> Node - Riak02
> Filesystem                1K-blocks      Used Available Use% Mounted on
> /dev/mapper/vgroot-lvroot  21200776   2401784  17729696  12% /
> udev                       16466476         4  16466472   1% /dev
> tmpfs                       6590228       280   6589948   1% /run
> none                           5120         0      5120   0% /run/lock
> none                       16475564         0  16475564   0% /run/shm
> /dev/mapper/vgroot-lvdata 628275200 489217444 139057756  78% /data
> /dev/sda1                    484068     49921    409776  11% /boot
> /dev/mapper/vgroot-lvhome   7785688    410856   6984280   6% /home
>
> Node - Riak03
> Filesystem                1K-blocks      Used Available Use% Mounted on
> /dev/mapper/vgroot-lvroot  21200776   2304152  17827328  12% /
> udev                       16466476         4  16466472   1% /dev
> tmpfs                       6590228       272   6589956   1% /run
> none                           5120         0      5120   0% /run/lock
> none                       16475564         0  16475564   0% /run/shm
> /dev/mapper/vgroot-lvdata 628275200 490374972 137900228  79% /data
> /dev/sda1                    484068     49921    409776  11% /boot
> /dev/mapper/vgroot-lvhome   7785688    410912   6984224   6% /home
>
> Node - Riak08
> Filesystem                1K-blocks      Used Available Use% Mounted on
> /dev/mapper/vgroot-lvroot  22135340   2838828  18175004  14% /
> udev                       16464744         4  16464740   1% /dev
> tmpfs                       3294768       280   3294488   1% /run
> none                           5120         0      5120   0% /run/lock
> none                       16473828         0  16473828   0% /run/shm
> /dev/mapper/vgroot-lvdata 642647464 487758140 154889324  76% /data
> /dev/sda1                    472036     60968    386697  14% /boot
> /dev/mapper/vgroot-lvhome   7688360    266484   7031324   4% /home
>
> Thanks,
> Anthony
>
> ********************************************
>
> Inmar Confidentiality Note: This e-mail and any attachments are
> confidential and intended to be viewed and used solely by the intended
> recipient. If you are not the intended recipient, be aware that any
> disclosure, dissemination, distribution, copying or use of this e-mail or
> any attachment is prohibited. If you received this e-mail in error, please
> notify us immediately by returning it to the sender and delete this copy
> and all attachments from your system and destroy any printed copies. Thank
> you for your cooperation.
>
> Notice of Protected Rights: The removal of any copyright, trademark, or
> proprietary legend contained in this e-mail or any attachment is prohibited
> without the express, written permission of Inmar, Inc. Furthermore, the
> intended recipient must maintain all copyright notices, trademarks, and
> proprietary legends within this e-mail and any attachments in their
> original form and location if the e-mail or any attachments are reproduced,
> printed or distributed.
>
> ********************************************
>
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/attachments/20150731/1fad8707/attachment-0001.html
> >
>
> ------------------------------
>
> Message: 2
> Date: Fri, 31 Jul 2015 19:57:00 -0400
> From: Matthew Brender <mbren...@basho.com>
> To: riak-users <riak-users@lists.basho.com>
> Subject: Riak Recap - July 31, 2015
> Message-ID:
>         <CAKrbaM2ywShy_c2HEow+031LxH2giXxDtEBKTXR_T8S4KHV=
> 8...@mail.gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Hey everyone! As always, there's been some great knowledge sharing over the
> user lists. Here is your biweekly Recap!
>
> Note that many of the "unanswered" are brand new and I know we'll get to
> answering them. I just want you all to know we keep track of these.
>
> ## Announcements
> * There's a new Java client in town! 2.0.2 was released [0]
>
> ## Recently Answered
> * Roman asks about using a FQDN and is guided to use 0.0.0.0 instead [1]
> * The Riak C client ran into a segfault that Chris explained [2]
> * Humberto found setting ulimit on OS X is a little tricky [3]
> * I gave Sevilha some resources to spin up Riak locally [4]
> * Zeeshan makes sense of indexing Riak Data Types for Marius [5]
>
> ## New and Unanswered
> * Damien needs some help thinking through ring-resizing on a live cluster
> [6]
> * Sean gets an error while deleting an object through the Erlang client [7]
> * Humberto could still use help with ulimit settings on OS X [8]
> * Nick is helping Johan with a known Erlang bug in need of a patch [9]
> * Damien sees 2i writes when he isn't using 2i. Any idea why? [10]
> * Amao sees an outbound handoff error and still needs some help [11]
>
> ## All the Meetups!
> We've been busy on the meetup scene, hosting events in Portland, Austin and
> Chicago. If you'd like to see where groups are, we have a new URL to do so
> [12]. Let me know if you'd like to run one in your city!
>
> ## What'd we miss?
> Have you brought Riak into production at a new place? Did you find a blog
> or presentation helpful? Good! Share it in our release notes:
> https://github.com/basho-labs/the-riak-community/tree/master/release-notes
>
> Have a great weekend,
> Matt Brender
> Developer Advocate @ Basho
> @mjbrender
>
> [0]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017374.html
> [1]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017370.html
> [2]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017366.html
> [3]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017379.html
> [4]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017385.html
> [5]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017390.html
> [6]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017364.html
> [7]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017368.html
> [8]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017383.html
> [9]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017377.html
> [10]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017386.html
> [11]
>
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/2015-July/017387.html
> [12] http://www.meetup.com/pro/basho/
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <
> http://lists.basho.com/pipermail/riak-users_lists.basho.com/attachments/20150731/b10b1e8c/attachment-0001.html
> >
>
> ------------------------------
>
> Subject: Digest Footer
>
> _______________________________________________
> riak-users mailing list
> riak-users@lists.basho.com
> http://lists.basho.com/mailman/listinfo/riak-users_lists.basho.com
>
>
> ------------------------------
>
> End of riak-users Digest, Vol 73, Issue 1
> *****************************************
>