I'm making a small interface for copying large groups of files around a filesystem. I have a progress bar that counts the items in the destination and increments as each new file is copied over. It compares this count to the number of files in the source and updates accordingly.
All is fine and dandy with an average number of files (<20000), but when the number of files to be copied becomes large, I end up getting "Maximum recursion depth exceeded" errors. I found out I could determine a safe recursion limit for the system and then raise it with sys.setrecursionlimit(), but it's still not giving me the depth I would like. I can also make it work by slowing the update speed way down, but that looks clumsy, and even then the top limit could be 10 times the amount of data I'm testing with now...maybe even 200GB. I don't really know how slowly I would need it to update to stay below the limit. Is there a way to get around recursion limits? Help!

~half.italian

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

def incrementProgress(self, window, workorder):
    dest_count = workorder.copyIndex
    self.label.config(text=workorder.movingFile)
    src_count = len(workorder.contents)
    if self.p >= 100:
        window.destroy()
        # RESET self.p so future moves work!!
        self.p = 0
        return
    # check for an empty workorder folder...it's already been moved
    if workorder.contents == []:
        window.destroy()
        self.p = 0
        return
    self.p = (float(dest_count)/float(src_count))*100
    print "Percentage copied:", self.p
    self.progressBar.updateProgress(self.p)
    time.sleep(.1)
    # recursive call -- one stack frame per update, which is what hits the limit
    self.incrementProgress(window, workorder)
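P.S. In case it matters, this is roughly how I'm raising the limit at the moment. It's just a minimal sketch, and the number is only a guess for my machine:

import sys

# Default limit is 1000; raising it only buys more frames,
# and going too high risks crashing the interpreter outright.
print sys.getrecursionlimit()
sys.setrecursionlimit(20000)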