I've found conflicting information about this.
mod_perl: Is there a performance impact to breaking up a single 350k file (monolithic package) into multiple files (still the same package name)?
I guess I'm asking: Are larger files inherently slower, even when pre-compiled? (Obviously, I'm not a computer science student.)
Here's the situation:
Over time, what was a 60k package has grown to 350k, encompassing many subs that handle, more or less, 3 mutually exclusive operations. I want to move 70% of the subs into their own files for maintenance/organization reasons, but is there a performance benefit or hit from doing this? The main file will basically dispatch, requiring the additional files only when necessary (sketched below). But under mod_perl, I believe the require will only happen once (per process) anyway.
Was:
package foo
[a,b,c] (~350k)
Now:
package foo
[a] (~75k)
require [b] (~115k) only when necessary
require [c] (~160k) only when necessary
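For concreteness, here is a minimal sketch of the "Now" layout, assuming hypothetical file names foo.pm, foo_b.pl, and foo_c.pl (and hypothetical sub names do_a/do_b/do_c); every file declares the same package foo, so callers see no difference, and under mod_perl each require is satisfied from %INC after the first call in a given child process:

    # foo.pm -- the main file, ~75k, loaded at server startup
    package foo;
    use strict;
    use warnings;

    sub dispatch {
        my ($op, @args) = @_;
        if ($op eq 'b') {
            require 'foo_b.pl';   # searched via @INC; compiled once per
                                  # child process, later calls just check %INC
            return do_b(@args);
        }
        elsif ($op eq 'c') {
            require 'foo_c.pl';
            return do_c(@args);
        }
        return do_a(@args);       # the common-case subs stay in this file
    }

    sub do_a { ... }              # placeholder body

    1;

    # foo_b.pl -- same package, ~115k of subs, loaded only on demand
    package foo;
    use strict;
    use warnings;

    sub do_b { ... }              # placeholder body

    1;   # require() needs a true return value

(foo_c.pl would follow the same pattern for the ~160k of [c] subs.)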
Thanks in advance for any enlightenment or a reference link.
W