This is something I've wondered about for a while. I know that, in theory, Python is supposed to recognize duplicate imports automatically; however, I've run into problems in the past when I didn't arrange the imports in a certain way across multiple files. As a result, I worry about conflicts arising because something has been imported twice. So...I'm not sure Python *always* gets this right.
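For what it's worth, the caching is easy to observe directly: Python stores every imported module in the sys.modules dictionary, so a second import of the same module is just a lookup that returns the very same object, and the module's code is not executed again. A quick check:

```python
import sys
import os

# The first "import os" ran the module's code and cached the
# resulting module object in sys.modules under the key "os".
first = sys.modules["os"]

import os  # a repeated import is only a dictionary lookup

# Both names refer to the one cached module object, so a
# duplicate import can't create a second, conflicting copy.
assert os is first
print("os was imported once and cached in sys.modules")
```

(Conflicts people do run into usually come from name shadowing or circular imports, not from the module being loaded twice.)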
Also, I understand what you're saying about organizing files around modules and maybe regrouping based on use. However, I like my files to be organized as a grouping of the logical components in my application. That makes it easy to keep things separated and keeps files from getting too long (long files take longer to program because you're always bouncing up and down in them). If I also have to worry about grouping by shared modules, that becomes more difficult. I think it would be great to have access to a file (like the __init__.py file for packages) that all the files in the same directory could use for common imports. That way you could factor out the repeated imports and clean up the look a little bit as well.

On Jan 7, 11:53 am, Paul McGuire <pt...@austin.rr.com> wrote:
> ...and don't worry about a possible performance issue of importing os
> (or any other module) multiple times - the Python import manager is
> smart enough to recognize previously imported modules, and won't import
> them again.
>
> If a module uses the os module, then it should import it - that's just
> it.
>
> Another consideration might be that you are breaking up your own
> program modules too much. For instance, if I had a program in which I
> were importing urllib in lots of modules, it might indicate that I
> still have some regrouping to do, and that I could probably gather all
> of my urllib dependent code into a single place.
>
> -- Paul

--
http://mail.python.org/mailman/listinfo/python-list
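Incidentally, something close to that "shared imports" idea already works today, since names imported in a package's __init__.py become attributes of the package. A minimal sketch (the package name "myapp" and the modules inside it are made up for illustration; the script builds a throwaway package on disk just to demonstrate):

```python
import os
import sys
import tempfile
import textwrap

# Create a disposable package directory to stand in for a real project.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "myapp")  # "myapp" is a hypothetical name
os.mkdir(pkg)

# The common imports live once, in the package's __init__.py.
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("import os\nimport json\n")

# A sibling module pulls them in with a single line.
with open(os.path.join(pkg, "parser.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from myapp import os, json  # reuse the package-level imports

        def cwd_as_json():
            return json.dumps({"cwd": os.getcwd()})
        """))

sys.path.insert(0, root)
from myapp import parser
print(parser.cwd_as_json())
```

The modules still get imported explicitly somewhere, so the dependency is visible; you've just moved the repetition into one file per directory.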