On 12Jun2018 15:00, Bill Deegan <b...@baddogconsulting.com> wrote:
I'm doing some refactoring on a fairly large python codebase.
Some of the files are > 4000 lines long and contain many classes.

Should I expect any performance hit from splitting some of the classes out
to other files?

In general, nothing significant. Yes, your programs will load more files than before. But only once per invocation.
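
A minimal illustration of that once-per-invocation behaviour, using the stdlib "json" module as a stand-in for one of your own: a repeat import of an already-loaded module is just a lookup in sys.modules.

    import sys
    import time

    t0 = time.perf_counter()
    import json                  # first import: file located, read, executed
    t1 = time.perf_counter()
    import json                  # repeat import: just a sys.modules lookup
    t2 = time.perf_counter()

    print("first:  %.6fs" % (t1 - t0))
    print("repeat: %.6fs" % (t2 - t1))
    print("cached:", "json" in sys.modules)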

If you really have concerns here, you would need to measure the change: not just a stub "import old_module and quit" vs "import new_module pieces and quit", but also against a larger program doing significant work. The former will let you measure how big the impact generally is, but the latter will tell you whether it matters.
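
A rough sketch of that stub measurement, assuming your old layout imports as "old_module" and the split layout as "new_module" (both names are placeholders for your real modules): time a fresh "import and quit" in a subprocess, so the import cost is paid on every run. Recent Pythons can also give a per-module breakdown with "python -X importtime".

    import subprocess
    import sys
    import time

    def time_import(modname, repeats=20):
        # Each run starts a fresh interpreter, so nothing is cached in-process.
        # The figure includes interpreter startup; compare the difference
        # between the two layouts rather than the absolute numbers.
        times = []
        for _ in range(repeats):
            t0 = time.perf_counter()
            subprocess.run([sys.executable, "-c", "import " + modname], check=True)
            times.append(time.perf_counter() - t0)
        return min(times)

    for name in ("old_module", "new_module"):
        print(name, time_import(name))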

Also, where does the cost go? In reading the .py code, or in compiling the code to whatever internal form your Python interpreter runs against?
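
If you want to see that split for one of your big files, here is a rough sketch (the path is a placeholder). Bear in mind that CPython normally caches the compiled form in __pycache__, so the compile cost usually only recurs when the source changes.

    import time

    path = "big_module.py"    # placeholder: one of your >4000-line files

    t0 = time.perf_counter()
    with open(path) as f:
        source = f.read()
    t1 = time.perf_counter()
    code = compile(source, path, "exec")
    t2 = time.perf_counter()

    print("read:    %.6fs" % (t1 - t0))
    print("compile: %.6fs" % (t2 - t1))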

If the (time) cost lies more in the compilation than in opening a few extra files, then you might actually win with the breakup, provided most programmes do not need to import _all_ the library pieces. See the sketch below.
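
For example (a hypothetical layout, names made up): if the split leaves the package __init__.py empty, a programme needing only one class pays only for the piece it imports.

    # mylib/
    #     __init__.py      (kept empty, so "import mylib" stays cheap)
    #     parsing.py       (class Parser and friends)
    #     reporting.py     (class Reporter and friends)

    from mylib.parsing import Parser    # reporting.py is never read or compiled

    p = Parser()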

I think I'm saying: don't worry unless your applications are very time-critical (e.g. invoked very frequently and/or doing almost nothing after the "import" phase) or you notice a significant slowdown after your changes. And if that happens, it is usually easier to stick things back together than it was to pull them apart.

Cheers,
Cameron Simpson <c...@cskk.id.au>
