David Boyes wrote:
> Umm, not on the OSes I mentioned. If you fstat the file or read the
> directory inode with Unix compatibility on, the underlying OS reads the
> file once to determine the actual file size in bytes to fill into the
> file stat structure, in order to be compatible with the assumption that
> files are streams of bytes. You then get to do it again to get the
> actual data blocks.
>
> Reading a 20 TB file twice is nontrivial. I have LOTS of files that
> large, and a few that will grow to exabyte scale in the not too
> distant future.
>
> I think we should just exclude those from consideration.

What, exactly, can we back up an exabyte file to? A billion CDs!
If you must use Bacula for your (very much smaller) 20 TB files, then do nothing. On those systems you don't have a filesize, for the reasons given. But you already don't have a filesize, so adding a size field to the catalogue changes nothing. Set it to zero, or to the number of blocks, or something.

--John

_______________________________________________
Bacula-devel mailing list
Bacula-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-devel