Lowell Alleman <lowel...@gmail.com> added the comment: I've run into the same problem before. I've found that, due to differences in how Unix and Windows handle files (inodes vs. file handles), the problem is more apparent on Windows, but it isn't handled 100% correctly on Unix systems either.
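To illustrate what goes wrong (a sketch of my own, not from the original report): the rollover step in RotatingFileHandler.doRollover() renames the live log file, and on Windows os.rename() refuses to move a file that any process still has open. On Unix the rename succeeds, but other processes keep writing to the renamed inode, so records end up in the wrong file. Roughly:

    import os

    # Hold an open handle, standing in for another process's handler.
    f = open("app.log", "a")
    try:
        # This is essentially what doRollover() does during rotation.
        os.rename("app.log", "app.log.1")
    except OSError as e:
        # Typical on Windows while the file is open; on Unix the rename
        # succeeds, but f still points at the renamed file.
        print("rename failed:", e)
    finally:
        f.close()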
I think the root of the problem is that nothing in the code handles multiple concurrent processes writing to a single log file, and I'm not sure there is a simple fix. I tried several easy workarounds, such as retrying failed file renames and re-opening closed files, but I ultimately discovered that all such approaches are inadequate and can even lose old log files in the worst case. I finally got frustrated enough to take the time to write my own handler. It is based on the built-in one and aims to be a "drop-in" replacement. I use file locking to safely write to a single log file from multiple Python processes concurrently. If you would like to give it a try, it is located here: http://pypi.python.org/pypi/ConcurrentLogHandler

I agree that it would be nice for the built-in logging handlers to do this for you, but in the meantime this may be an option for you.
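Typical usage looks something like the sketch below; see the PyPI page for the exact class name and import path. I'm assuming cloghandler.ConcurrentRotatingFileHandler here, with the same constructor signature as the stdlib RotatingFileHandler (filename, mode, maxBytes, backupCount):

    import logging
    # Assumed import path -- check the PyPI page for the exact name.
    from cloghandler import ConcurrentRotatingFileHandler

    log = logging.getLogger("myapp")
    # Same arguments as logging.handlers.RotatingFileHandler:
    # append mode, rotate near 512 KB, keep 5 backup files.
    handler = ConcurrentRotatingFileHandler("myapp.log", "a", 512 * 1024, 5)
    handler.setFormatter(
        logging.Formatter("%(asctime)s %(process)d %(levelname)s %(message)s"))
    log.addHandler(handler)
    log.setLevel(logging.INFO)
    log.info("safe to emit from several processes at once")

Because it keeps the stock rotation interface, the only change to existing code should be the import and the handler class name; the file locking happens inside the handler around each emit and rollover.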