Moore, George T. wrote:
It depends on how you are using your scripts. The most "expensive"
aspect of the files is the I/O used to read them from the hard drive
they reside on. If you are calling the scripts multiple times and they
have to be read each time, rather than being cached in memory, then you
only want to read what is absolutely necessary. If one script always
calls another, then you are probably better off having the subroutines,
which would save I/O.
I'll be more specific about my setup. I have two scripts running a
bulletin-board type thing. One of them is responsible for displaying
the posts, either all at once or one at a time. The other script
handles replying to posts or submitting new ones. I'd estimate that the
script that displays the posts gets used more often, since more people
look than actually say anything.
It would be easier to maintain this program if I made these two
functions subroutines of one larger script, and then called them using
some switch logic.
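Something along these lines is what I have in mind, assuming CGI.pm and
a made-up 'action' parameter (the subroutine names and bodies are just
placeholders for my real display and submission code):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q = CGI->new;

    # A made-up 'action' parameter picks which part of the board runs.
    my $action = $q->param('action') || 'view';

    if    ($action eq 'view')  { show_posts($q) }
    elsif ($action eq 'reply') { save_post($q) }
    elsif ($action eq 'new')   { save_post($q) }
    else                       { show_posts($q) }

    # Placeholder subroutines; the real ones would read and write the
    # post files.
    sub show_posts { my ($q) = @_; print $q->header, "display posts here\n" }
    sub save_post  { my ($q) = @_; print $q->header, "store the post here\n" }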
My main question, then, is this: when a user visits the script, does
the server load only one instance of the program per visit, or does
each page change start a new instance? If only one process is loaded
per visit (for instance, the program is compiled once and then reused
while the user stays within pages covered by that program), it makes
more sense to combine them into one larger program.
However, if the whole program must be loaded by the server each time
the user visits a new page (that is, the program must be compiled and
run for each request), then it makes sense to load only what is
necessary by splitting the script up into several smaller ones.
As may be obvious by now, I do not know very much about how servers
handle requests for CGI programs, so I'm sorry if this question is
posed in a nonsensical way.
Thanks,
Adam