On 2024-06-15 07:48, Gordon Bergling wrote:
> Hi Craig,
>
> On Fri, Jun 14, 2024 at 10:26:57PM -0700, Craig Leres wrote:
> > On 6/14/24 22:18, Gordon Bergling wrote:
> > > I just upgraded a 13.3-RELEASE system to 14.1-RELEASE, which has
> > > two ZFS pools, z for ZFS-on-root and storage for assorted stuff.
> > > While a 'pkg upgrade' fails, I discovered something strange with
> > > the free disk space.
> > [...]
> > > In general I don't have any clue where the 16.5G are allocated.
> > >
> > > Any hints how to get the space back? I have deleted all snapshots,
> > > but the available space is still 0B.
> >
> > I ran into this after upgrading a small vultr vm; I couldn't install
> > one last large package because there wasn't enough free space. In my
> > case it was due to a bunch of old boot environment snapshots. I don't
> > know what happens if you delete the related snapshots without using
> > "bectl", but it still might be worth looking at bectl(8).
>
> Thanks for the hint on bectl(8). After using 'bectl destroy' to delete
> the remaining snapshots / boot environments, I got 9 GB of free space
> back. I don't know why they aren't listed with 'zfs list -t snapshot',
> but the space is back and I can finish the update to 14.1-RELEASE.
Check "zfs list -t all", this gives a better overview.

# bectl list
BE                Active Mountpoint Space Created
2024-05-13-095022 -      -          1.80G 2024-05-14 12:44
2024-05-17-070639 -      -          967M  2024-05-18 12:32
2024-05-27-130054 -      -          153M  2024-05-29 12:33
2024-06-06-091808 NR     /          10.2G 2024-06-06 09:32

# zfs list -t all -r mpool | grep ROOT
mpool/ROOT                                          10.2G  758G    96K  none
mpool/ROOT/2024-05-13-095022                        2.86M  758G  2.82G  /
mpool/ROOT/2024-05-17-070639                        2.88M  758G  3.07G  /
mpool/ROOT/2024-05-27-130054                        2.36M  758G  3.22G  /
mpool/ROOT/2024-06-06-091808                        10.2G  758G  3.23G  /
mpool/ROOT/2024-06-06-091808@2023-07-26-21:26:20    3.95G     -  4.19G  -
mpool/ROOT/2024-06-06-091808@2024-05-18-10:32:27    1.80G     -  2.82G  -
mpool/ROOT/2024-06-06-091808@2024-05-29-12:33:26     964M     -  3.09G  -
mpool/ROOT/2024-06-06-091808@2024-06-06-09:32:11-0   150M     -  3.22G  -
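As a quick sanity check, you can total the space pinned by snapshots with a small filter. A sketch (the script name is mine, not from the thread): "-Hp" makes zfs print exact byte values, one per line, which the awk sums into GiB.

```shell
#!/bin/sh
# snap_space.sh -- sum raw byte counts (one per line) on stdin and
# print the total in GiB.  Intended use on a live system:
#   zfs list -Hp -t snapshot -o used -r mpool | sh snap_space.sh
awk '{ total += $1 } END { printf "%.2f GiB\n", total / (1024 * 1024 * 1024) }'
```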
When I delete the 2024-05-13 BE (including the origin snapshot via "bectl destroy -o xxx"), it frees up the 2.86M from the pool and the 1.8G from the snapshot. And it seems I forgot the origin snapshot for my 2023-07-26 BE, which still holds 3.9G...
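To see which BE datasets are clones (and thus which origin snapshot still pins their space), the "origin" property can be inspected and filtered. A sketch (the script name and awk filter are mine; the zfs invocation and dataset names follow this thread):

```shell
#!/bin/sh
# clones.sh -- read "dataset<TAB>origin" pairs on stdin and report
# clone datasets together with the origin snapshot that pins them.
# Intended use on a live system:
#   zfs get -H -o name,value origin -r mpool/ROOT | sh clones.sh
awk -F'\t' '$2 != "-" { printf "%s is a clone of %s\n", $1, $2 }'
```

Anything reported here should be destroyed via "bectl destroy -o" rather than raw "zfs destroy", so the origin goes away with the BE.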
# zfs destroy mpool/ROOT/2024-06-06-091808@2023-07-26-21:26:20
# zfs list -t all -r mpool | grep ROOT
mpool/ROOT                                          6.25G  762G    96K  none
mpool/ROOT/2024-05-13-095022                        2.86M  762G  2.82G  /
mpool/ROOT/2024-05-17-070639                        2.88M  762G  3.07G  /
mpool/ROOT/2024-05-27-130054                        2.36M  762G  3.22G  /
mpool/ROOT/2024-06-06-091808                        6.24G  762G  3.23G  /
mpool/ROOT/2024-06-06-091808@2024-05-18-10:32:27    1.80G     -  2.82G  -
mpool/ROOT/2024-06-06-091808@2024-05-29-12:33:26     964M     -  3.09G  -
mpool/ROOT/2024-06-06-091808@2024-06-06-09:32:11-0   150M     -  3.22G  -
# bectl destroy -o mpool/ROOT/2024-05-13-095022
# zfs list -t all -r mpool | grep ROOT
mpool/ROOT                                          4.44G  764G    96K  none
mpool/ROOT/2024-05-17-070639                        2.88M  764G  3.07G  /
mpool/ROOT/2024-05-27-130054                        2.36M  764G  3.22G  /
mpool/ROOT/2024-06-06-091808                        4.44G  764G  3.23G  /
mpool/ROOT/2024-06-06-091808@2024-05-29-12:33:26     989M     -  3.09G  -
mpool/ROOT/2024-06-06-091808@2024-06-06-09:32:11-0   150M     -  3.22G  -
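If you want a running total of what your old BEs cost, the Space column of "bectl list -H" can be summed the same way. A sketch (the script name is mine; it assumes Space is the 4th tab-separated field of the -H output, as on 14.x, and only handles K/M/G suffixes):

```shell
#!/bin/sh
# be_space.sh -- sum the human-readable Space column (4th field) of
# `bectl list -H` output and print the total in GiB:
#   bectl list -H | sh be_space.sh
awk -F'\t' '{
    n = $4 + 0                      # numeric part, e.g. "1.80G" -> 1.80
    u = substr($4, length($4), 1)   # unit suffix
    if (u == "G")      n *= 1024 * 1024 * 1024
    else if (u == "M") n *= 1024 * 1024
    else if (u == "K") n *= 1024
    total += n
} END { printf "%.2f GiB\n", total / (1024 * 1024 * 1024) }'
```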
Bye,
Alexander.

-- 
http://www.Leidinger.net alexan...@leidinger.net: PGP 0x8F31830F9F2772BF
http://www.FreeBSD.org   netch...@freebsd.org   : PGP 0x8F31830F9F2772BF