Just remember: if you are not running AAE (active anti-entropy), you will want
to do a manual repair on that partition to ensure you have adequate replication:
https://docs.riak.com/riak/kv/2.2.3/using/repair-recovery/repairs.1.html#repairing-partitions
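If you do go the manual route, the procedure on that page essentially amounts to
attaching to the node's console and calling riak_kv_vnode:repair/1 with the
partition index. A quick sketch, using the partition from the vnode directory
below and a placeholder node name in the prompt:

    riak attach
    (riak@node)1> riak_kv_vnode:repair(844930634081928249586505293914101120738586001408).
    %% detach from the console when done; do NOT use q(), which would stop the node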
> On Sep 4, 2019, at 5:19 AM, Bryan Hunt wrote:
>
Bryan* (misspelled your name), I need a morning coffee :(
On 04/09/2019 10:29, Guido Medina wrote:
Thanks a lot Brian, I have deleted that directory and the node has now
started; let's see how it behaves.
Guido.
On 04/09/2019 10:19, Bryan Hunt wrote:
Easiest solution is to just delete the vnode and let it recover from the
replicas - the vnode directory will be 844930634081928249586505293914101120738586001408
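For the archives, the delete-and-restart sequence looks roughly like the below.
The leveldb path is an assumption (it depends on platform_data_dir and the
backend configured in riak.conf), so adjust it for your install; moving the
directory aside is also safer than deleting it outright:

    riak stop
    # path is an assumption - check platform_data_dir and the backend data dir in riak.conf
    mv /var/lib/riak/leveldb/844930634081928249586505293914101120738586001408 /tmp/
    riak start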
> On 4 Sep 2019, at 10:01, Guido Medina wrote:
>
> Hi all,
>
> We had a power cut which caused one of the nodes to corrupt one of the
>