On Fri, Jul 9, 2010 at 5:18 PM, Brandon High <bh...@freaks.com> wrote:

> I think that DDT entries are a little bigger than what you're using. The
> size seems to range between 150 and 250 bytes depending on how it's
> calculated, call it 200b each. Your 128G dataset would require closer to
> 200M (+/- 25%) for the DDT if your data was completely unique. 1TB of unique
> data would require 600M - 1000M for the DDT.
>

Using 376 bytes per entry, it's 376MB for 128G of unique data, or just under 3GB
for 1TB of unique data.

A 1TB zvol with 8k blocks has 16x as many entries, so it would require roughly
47GB of memory to hold the DDT. Ouch.
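For anyone who wants to redo the arithmetic for their own pool, here's a quick back-of-the-envelope sketch. The 376-byte entry size is the estimate used above, not an official constant, and the helper name is mine:

```python
# Rough DDT memory estimate: one table entry per unique block.
# DDT_ENTRY_BYTES is the ~376-byte per-entry figure discussed above,
# an approximation, not a value exported by ZFS.
DDT_ENTRY_BYTES = 376

def ddt_memory_bytes(unique_bytes, block_bytes):
    """Estimated in-core DDT size for fully unique data at a given block size."""
    return (unique_bytes // block_bytes) * DDT_ENTRY_BYTES

GiB = 2**30
TiB = 2**40

# 128G dataset at the default 128k recordsize -> ~0.37 GiB (376 MiB)
print(ddt_memory_bytes(128 * GiB, 128 * 1024) / GiB)
# 1TB of unique data at 128k -> ~2.94 GiB ("just under 3GB")
print(ddt_memory_bytes(1 * TiB, 128 * 1024) / GiB)
# 1TB zvol at 8k blocks -> ~47 GiB
print(ddt_memory_bytes(1 * TiB, 8 * 1024) / GiB)
```

The 8k case hurts because halving the block size doubles the entry count: 8k blocks mean 16x as many DDT entries as 128k records for the same amount of data.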

-B

-- 
Brandon High : bh...@freaks.com
_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss
