"Bruce" <[EMAIL PROTECTED]> wrote:

>A little late but.. thanks for the replies, was very useful. Here's
>what I do in this case:
>
>def search(a_dir):
>    valid_dirs = []
>    walker = os.walk(a_dir)
>    while 1:
>        try:
>            dirpath, dirnames, filenames = walker.next()
>        except StopIteration:
>            break
>        if dirtest(dirpath):
>            valid_dirs.append(dirpath)
>    return valid_dirs
>
>def dirtest(a_dir):
>    testfiles = ['a', 'b', 'c']
>    for f in testfiles:
>        if not os.path.exists(os.path.join(a_dir, f)):
>            return 0
>    return 1
>
>I think you're right - it's not os.walk that makes this slow, it's the
>dirtest method that takes so much more time when there are many files
>in a directory. Also, thanks for pointing me to the path module, was
>interesting.
Umm, may I point out that you don't NEED the "os.path.exists" call,
because you are already being HANDED a list of all the filenames in
that directory?  You could "dirtest" with this much faster routine:

def dirtest(a_dir, filenames):
    for f in ['a', 'b', 'c']:
        if f not in filenames:
            return 0
    return 1

--
- Tim Roberts, [EMAIL PROTECTED]
  Providenza & Boekelheide, Inc.
--
http://mail.python.org/mailman/listinfo/python-list
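[Editor's note: putting the two halves of the thread together, the revised version reads roughly as below. This is a sketch, not code from either poster: the test-file names 'a', 'b', 'c' come from the original post, the for-loop over os.walk replaces the manual next()/StopIteration dance, and dirtest now takes the filenames list that os.walk already provides, as Tim suggests.]

```python
import os

def dirtest(a_dir, filenames):
    # Membership test against the listing os.walk already handed us --
    # no extra per-file stat() calls via os.path.exists.
    for f in ['a', 'b', 'c']:
        if f not in filenames:
            return False
    return True

def search(a_dir):
    valid_dirs = []
    # Iterating the generator directly replaces the manual
    # walker.next() / StopIteration loop from the original post.
    for dirpath, dirnames, filenames in os.walk(a_dir):
        if dirtest(dirpath, filenames):
            valid_dirs.append(dirpath)
    return valid_dirs
```

[For directories with very many entries, converting filenames to a set once (set(filenames)) before the membership tests would make each lookup O(1) instead of a linear scan.]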