Terry Hancock wrote:
> The user with write access would run the script, causing the .pyc files
> to be generated for that interpreter. Then a normal user, running an
> older Python, tries to load the modules. Since a .pyc file exists, it gets
> used instead, but *oops*, it's for a later version of the interpreter and
> stuff breaks.
>
> A better solution than getting rid of the .pyc files would be to put good
> ones there --- use the version of Python that users are expected to be
> using and generate them. If you delete the .pyc files, you create an
> unnecessary drag on performance and the hazard remains to mess you
> up again. If the .pyc files are generated, though, I *think* they will be
> used and will work for both the expected Python and (fingers crossed) the
> later version.
Sorry Terry, but both assumptions are wrong. Different versions of Python store different "magic numbers" in their .pyc files, and an interpreter will not use a .pyc file with the wrong number. I believe you'll actually get a "bad magic number" error if you do manage to force a bad .pyc to be loaded (probably by having no matching .py from which to recompile). If the .py does exist, it will be recompiled and the newly generated bytecode will be used, whether or not the .pyc file can be written out to cache it for next time.

Your suggestion about pre-generating the .pyc files (using compileall) is a good one in general for this sort of setup (shared libraries), though it really won't help if there are different versions of Python in use. (If that's true, nothing will really help, except perhaps PEP 304 and the time machine. Well, having the two versions use two different copies of the library files would help, but the OP doesn't think there are different versions in use.)

-Peter
--
http://mail.python.org/mailman/listinfo/python-list
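For anyone following along, the magic-number check can be seen directly: the first four bytes of a .pyc file are the compiling interpreter's magic number, and the import machinery only reuses a cached file when they match the running interpreter. A minimal sketch (modern Python 3; the module name and temp paths are made up for illustration):

```python
import importlib.util
import pathlib
import py_compile
import tempfile

# Compile a throwaway module and inspect the resulting .pyc header.
with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "example.py"
    src.write_text("x = 1\n")
    pyc = py_compile.compile(str(src), cfile=str(src.with_suffix(".pyc")))

    # The first four bytes of any .pyc are the magic number of the
    # interpreter that wrote it.
    with open(pyc, "rb") as f:
        magic = f.read(4)

    # The import system only reuses the cache when this matches:
    print(magic == importlib.util.MAGIC_NUMBER)  # prints True
```

To pre-generate bytecode for a whole library tree, as Terry suggests, the stdlib's `python -m compileall <directory>` does the job; the key is to run it with the same interpreter version the end users will actually use.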