I am definitely planning to split the images into directories by size, which will at least divide the total by the number of different sizes (though on the higher end a single directory could still hold 150,000 - 175,000 images, which is still a pretty big number). I don't know whether this will be a problem, or whether there is really anything to worry about at all, but it is better to ask those who have been there and done that, or who are at least a bit more familiar with pushing the limits of Unix resources, than to wonder whether it will work.
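For what it's worth, here is a rough sketch of the kind of bucketing I have in mind. The directory names and the 100 KB bucket width are just placeholders, and the grouping key here is file size in bytes; grouping by image dimensions (via PIL) would look much the same.

import os
import shutil

def bucket_by_size(src_dir, dest_root, bucket_bytes=100 * 1024):
    """Move files from src_dir into dest_root/size_NNNNNN/ subdirectories.

    Files whose sizes fall in the same bucket_bytes-wide range end up in
    the same subdirectory, keeping any single directory well below the
    full file count.
    """
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        if not os.path.isfile(path):
            continue
        # Pick a bucket based on file size (or any other grouping key).
        bucket = os.path.getsize(path) // bucket_bytes
        bucket_dir = os.path.join(dest_root, "size_%06d" % bucket)
        if not os.path.isdir(bucket_dir):
            os.makedirs(bucket_dir)
        shutil.move(path, os.path.join(bucket_dir, name))

if __name__ == "__main__":
    # Hypothetical source and destination directories.
    bucket_by_size("images", "images_by_size")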
Regards, David
On Monday, March 28, 2005, at 07:18 PM, Kane wrote:
I ran into a similar situation with a massive directory of PIL-generated images (around 10k). No problems on the filesystem/Python side of things, but other tools (most notably 'ls') don't cope very well. As it happens, my data has natural groups, so I broke the big dir into subdirs to sidestep the problem.
-- http://mail.python.org/mailman/listinfo/python-list