If you still have the VM setup, you could try booting the text
installer media again, entering rescue mode, and seeing how it wants
to mount the existing raid disks (not degraded, both present) in the
VM. That's where it used to happen consistently; otherwise, only
occasionally during normal reboots.
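Given the bug title ("root raid_members opened as luks"), the failure
mode seems to be probe order: the rescue environment treating the
individual raid member partitions as LUKS devices instead of assembling
the array first. A minimal sketch of the expected precedence, using a
hypothetical `classify` helper keyed on blkid-style TYPE values (the
function name and messages are illustrative, not from any actual tool):

```shell
#!/bin/sh
# Hypothetical classifier: given a blkid TYPE value, say how the rescue
# environment ought to treat the device. Raid members must be matched
# before LUKS, so that a LUKS volume living on top of an md array is
# only opened after mdadm has assembled it -- not per-member.
classify() {
  case "$1" in
    linux_raid_member) echo "assemble with mdadm first" ;;
    crypto_LUKS)       echo "open with cryptsetup" ;;
    *)                 echo "mount directly" ;;
  esac
}

classify linux_raid_member
classify crypto_LUKS
```

In the rescue shell, `blkid -s TYPE /dev/sdX2` shows which TYPE each
partition actually reports, so you can check whether the members are
being misclassified before the array comes up.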

> When trying to boot with one disk missing, the system would not boot

Right, after way too many release cycles with the unmaintained mdadm
modifications in Ubuntu, I moved to upstream mdadm.

-- 
You received this bug notification because you are a member of Ubuntu
Bugs, which is subscribed to Ubuntu.
https://bugs.launchpad.net/bugs/531240

Title:
  silently breaking raid: root raid_members opened as luks

To manage notifications about this bug go to:
https://bugs.launchpad.net/ubuntu-release-notes/+bug/531240/+subscriptions

-- 
ubuntu-bugs mailing list
ubuntu-bugs@lists.ubuntu.com
https://lists.ubuntu.com/mailman/listinfo/ubuntu-bugs
