New submission from Amir Szekely: tempfile._get_default_tempdir() tries to find a usable temporary directory by attempting to create a file in each candidate directory and write a short string to it. It deletes each probe file right after closing it. But if writing fails (rather than creating the file), the file is left behind. This can happen, for example, when the disk is full on Linux: open() succeeds and even write() appears to succeed, but as soon as close() is called, OSError is raised with errno 28 (No space left on device) and unlink() is never reached.
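For illustration, here is a rough sketch of the kind of per-directory probe involved (simplified, not the actual stdlib code; the helper name and probe file name are made up for this example), showing where the leak happens and how a try/finally avoids it:

    import os

    def probe_candidate(dirname):
        # Simplified stand-in for the per-directory probe in
        # tempfile._get_default_tempdir(); probe_candidate() and the
        # "probe-tmpfile" name are hypothetical.
        filename = os.path.join(dirname, "probe-tmpfile")
        fd = os.open(filename, os.O_RDWR | os.O_CREAT | os.O_EXCL, 0o600)
        try:
            fp = os.fdopen(fd, "wb")
            # On a full disk, write() to the buffered file object can still
            # appear to succeed; the flush performed by close() then raises
            # OSError with errno 28 (ENOSPC).
            fp.write(b"blat")
            fp.close()
        finally:
            # Without this finally, an exception from write()/close() would
            # skip the unlink() and leave the probe file behind -- which is
            # exactly the leak described above.
            os.unlink(filename)
        return dirname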
On our system (Linux 2.6.32, ext3), after filling the disk by mistake, we suddenly found dozens of random empty files scattered across the file system. The candidate list, where they all turned up, even includes the root directory, so the issue is quite visible. We first noticed the problem on Python 2.4.4, but I have been able to reproduce it with today's tip. A possible solution is to wrap everything between open() and unlink() in a try-finally block, so the file gets deleted no matter what happens. I have attached a patch implementing this solution, along with a new test case that reproduces the issue. Verified with `make test && make patchcheck`.
----------
components: Extension Modules
files: fix_tempfile_leaving_files_behind.patch
keywords: patch
messages: 178362
nosy: kichik
priority: normal
severity: normal
status: open
title: tempfile._get_default_tempdir() leaves files behind when HD is full
type: resource usage
versions: Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3, Python 3.4, Python 3.5
Added file: http://bugs.python.org/file28464/fix_tempfile_leaving_files_behind.patch

_______________________________________
Python tracker <rep...@bugs.python.org>
<http://bugs.python.org/issue16800>
_______________________________________