At 3:04 AM -0500 5/30/03, Bingrui Foo wrote:
I'm wondering, in FreeBSD, if I have a directory with 10,000 files, or
maybe even 100,000 files, each about 5 KB long, will reading and
writing to any one of these files in C be affected by the sheer number of
files? Will the access time be affected significantly?

Just wondering because I'm not sure whether I should put these data in a
database or just use files with unique names.

Also will separating the files into many directories help?

Looking up .../x/12/34/56 can be done in logarithmic time (i.e., look up
.../x/12, then .../x/12/34, then .../x/12/34/56); looking up .../y/123456
(unless some optimization, such as UFS_DIRHASH, has been added) will
require a linear scan through the directory. In short, don't go there...


-r
--
email: [EMAIL PROTECTED]; phone: +1 650-873-7841
http://www.cfcl.com/rdm    - my home page, resume, etc.
http://www.cfcl.com/Meta   - The FreeBSD Browser, Meta Project, etc.
http://www.ptf.com/dossier - Prime Time Freeware's DOSSIER series
http://www.ptf.com/tdc     - Prime Time Freeware's Darwin Collection
_______________________________________________
[EMAIL PROTECTED] mailing list
http://lists.freebsd.org/mailman/listinfo/freebsd-questions
To unsubscribe, send any mail to "[EMAIL PROTECTED]"
