Adam,

How your server loads your scripts depends entirely on the server and its configuration. Plain CGI is slow, ISAPI is faster, and mod_perl and FastCGI are faster still. Which one you're using, and how you use it, will change the picture dramatically. On the other hand, if you only get a few thousand hits a day, it's probably not worth worrying about.
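To make the compile-once-versus-compile-every-request difference concrete, here is a minimal simulation sketch (the sub names and config are made up for illustration, and no real web server is involved). In a persistent environment such as mod_perl or FastCGI, everything outside the request handler runs once per process; under plain CGI the whole script, setup included, runs again for every hit:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Simulated persistent process: in mod_perl/FastCGI the setup below runs
# once per server process; under plain CGI it would run on every request.
my $requests_served = 0;
my %config = load_config();    # imagine this is slow: parse files, open a DB

sub load_config { return ( board_title => 'Demo Board' ) }

# This is the only part that runs per request in a persistent environment.
sub handle_request {
    my ($action) = @_;
    $requests_served++;
    return "[$config{board_title}] action=$action (request #$requests_served)";
}

# Simulate three requests hitting the same long-lived process.
print handle_request('view'),  "\n";
print handle_request('reply'), "\n";
print handle_request('view'),  "\n";
```

Because the process stays alive, `$requests_served` keeps counting across requests; under plain CGI it would be 1 every time. (With the real FastCGI protocol you would wrap `handle_request` in a `while ( my $q = CGI::Fast->new ) { ... }` loop instead of calling it by hand.)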
As programmers get more experienced, they learn one of the most important rules of programming: do NOT worry about performance unless you have an extremely good reason to do so. Build your systems to be correct and complete, and only after you have a known performance issue should you worry about it. Even then, profile the system to find out what the issue actually is. For example, if your database connection or complicated SQL queries are causing the problem, fiddling with nested foreach loops probably isn't going to help much. The "don't worry about performance at first" idea is one that many programmers balk at, but once you adopt it, it makes life much, much easier.

Cheers,
Ovid

--
If this message is a response to a question on a mailing list, please send follow-up questions to the list.

Web Programming with Perl -- http://users.easystreet.com/ovid/cgi_course/

----- Original Message ----
From: Adam Waite <[EMAIL PROTECTED]>
To: beginners-cgi@perl.org
Sent: Thursday, June 15, 2006 4:43:56 AM
Subject: Re: Multiple .cgi scripts vs. one large script

Moore, George T. wrote:
> It depends on how you are using your scripts. The most "expensive"
> aspect of the files is the IO used to read them from the hard drive they
> reside on. If you are calling on the scripts multiple times and they
> have to be read each time, rather than being cached in memory, then you
> only want to read what is absolutely necessary. If one script always
> calls another then you are probably better off having the subroutines,
> which would save IO.

I'll be more specific about my setup. I have two scripts running a bulletin-board type thing. One of them is responsible for displaying all of the posts at once, or just displaying one at a time. The other script handles replying to posts or submitting new posts. I'd estimate that the script responsible for looking at the posts gets used more often, since more people look than actually say anything.
It would be easier to maintain this program if I made these two functions subroutines of one larger script, and then called them using some switch logic.

My main question, then, is this: if a user visits the script, does the server load only one instance of the program per visit, or does each page change start a new instance? If only one process is loaded per visit (that is, the program is compiled once and then reused while the user stays within pages covered by that program), it makes more sense to combine them into one larger program. However, if the whole program must be loaded by the server each time the user visits a new page (compiled and run for every request), then it makes sense to load only what is necessary by splitting the script into several smaller ones.

As may be obvious by now, I don't know very much about how servers handle requests for CGI programs, so I'm sorry if this question is posed in a nonsensical way.

Thanks,
Adam

--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
<http://learn.perl.org/> <http://learn.perl.org/first-response>
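For the record, the "switch logic" Adam describes is most commonly written in Perl as a dispatch table: a hash mapping an action name to a code reference. A minimal sketch, with the handler names and the 'view'/'submit' actions made up for illustration (in a real CGI script the action would come from something like a query parameter rather than @ARGV):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical handlers for the two halves of the bulletin board.
sub view_posts  { return "showing all posts" }
sub submit_post { return "saving a new post" }

# Dispatch table: map an action name to the sub that handles it.
my %dispatch = (
    view   => \&view_posts,
    submit => \&submit_post,
);

my $action  = shift(@ARGV) || 'view';
my $handler = $dispatch{$action} || \&view_posts;    # unknown action: default to viewing
print $handler->(), "\n";
```

Adding a new page then means adding one sub and one hash entry, rather than another standalone script.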