On 5/29/2010 12:22 AM, schatten wrote:
Okay.

I had/have a running snv134 install on one half of my disk. I created a zfs 
(zfs create rpool/VB) for my virtualbox. Then zfs set 
mountpoint=/export/home/schatten/VirtualBox rpool/VB. Then a reboot, and it hangs 
right before the login should appear.
I removed the zfs with an OSOL livecd and booting works.

Then I tried to add the other half of my disk.
First formatting it, then zpool create c5d1p0 (not sure exactly, but zpool list 
showed the other half as up and running). Reboot, same as above.

OK, let me get this straight:

(1) Your boot disk has a Solaris fdisk partition that takes up 50% of the actual disk space
(2) Inside that fdisk partition, you have b134 installed, with the zpool being the default 'rpool'
(3) The following zfs filesystems exist:
            rpool/export
            rpool/export/home
            rpool/export/home/schatten
(4) You do a 'zfs create rpool/VB'
(5) You then do 'zfs set mountpoint=/export/home/schatten/VB rpool/VB'
(6) Everything works fine until you reboot the system, after which it hangs before displaying the GDM login screen.

Right?
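For reference, the sequence as I understand it would look roughly like this (dataset and mountpoint names taken from your mail; adjust to your actual layout):

```shell
# Create a dataset for VirtualBox under the root pool
zfs create rpool/VB

# Redirect its mountpoint into the existing home directory
# (rpool/export/home/schatten must already be mounted at /export/home/schatten)
zfs set mountpoint=/export/home/schatten/VB rpool/VB

# Sanity-check before rebooting
zfs list -o name,mountpoint rpool/VB
```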



Also, you are NOT going to be able to use the 2nd fdisk partition on your boot drive - OpenSolaris only recognizes 1 Solaris fdisk partition per drive at this point. It will recognize more than one Solaris *slice* inside an fdisk partition, though.
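So instead of a second fdisk partition, the supported route is to lay out additional slices inside the one Solaris fdisk partition and build a pool on a slice. A rough sketch (the device name c5d1s3 and pool name 'tank' are just examples, not from your setup):

```shell
# Define slices inside the single Solaris fdisk partition
# (interactive: partition -> modify, assign space to e.g. slice 3)
format

# Build a separate pool on that slice
zpool create tank c5d1s3
zpool list
```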


--
Erik Trimble
Java System Support
Mailstop:  usca22-123
Phone:  x17195
Santa Clara, CA

_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss