Re: Multiple .cgi scripts vs. one large script

2006-06-15 Thread Ovid
Adam,

How your server is going to load things depends entirely upon the server and 
its configuration.  CGI is slow, ISAPI is faster, mod_perl and FastCGI are 
faster still.  Which one you're using and how you use it will dramatically 
change things.  On the other hand, if you only have a few thousand hits a day, 
it's probably not worth worrying about.

As programmers get more experienced, they learn one of the most important rules 
of programming:  do NOT worry about performance unless you have an extremely 
good reason to do so.  Build your systems to be correct and complete and only 
after you have a known performance issue should you worry about that.  Even 
then, profile the system to find out what the issue is.  For example, if it's 
your database connection or complicated SQL queries which are causing the 
problem, fiddling with nested foreach loops probably isn't going to help that 
much.
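
To illustrate that "measure first" advice, here is a minimal sketch using 
Perl's core Benchmark module. The two subroutines are invented examples of 
the kind of micro-difference people fiddle with before measuring:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw(timethese);

# Hypothetical question: is string building really the bottleneck?
# Measuring first avoids optimizing the wrong thing.
my @rows = map { "row $_" } 1 .. 1_000;

timethese( 2_000, {
    concat => sub {                 # build output by repeated concatenation
        my $out = '';
        $out .= "$_\n" for @rows;
        return $out;
    },
    join => sub {                   # build output with a single join
        return join '', map { "$_\n" } @rows;
    },
} );
```

Whatever the numbers say, they usually matter far less than a slow database 
query, which is exactly Ovid's point.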

The "don't worry about performance at first" concept is one that many 
programmers balk at, but once you adopt it, it makes life much, much easier.

Cheers,
Ovid
 
-- If this message is a response to a question on a mailing list, please send 
follow up questions to the list.
 
Web Programming with Perl -- http://users.easystreet.com/ovid/cgi_course/

- Original Message 
From: Adam Waite <[EMAIL PROTECTED]>
To: beginners-cgi@perl.org
Sent: Thursday, June 15, 2006 4:43:56 AM
Subject: Re: Multiple .cgi scripts vs. one large script

Moore, George T. wrote:
> It depends on how you are using your scripts. The most "expensive"
> aspect of the files is the IO used to read them from the hard drive they
> reside on. If you are calling on the scripts multiple times and they
> have to be read each time, rather than being cached in memory, then you
> only want to read what is absolutely necessary. If one script always
> calls another then you are probably better having the subroutines, which
> would save IO. 

I'll be more specific about my setup.  I have two scripts running a 
bulletin-board type thing.  One of them is responsible for displaying 
all of the posts at once, or just displaying one at a time.  The other 
script handles replying to posts or submitting new posts.  I'd estimate 
that the script responsible for looking at the posts gets used more 
often, since more people look than actually say anything.

It would be easier to maintain this program if I made these two 
functions subroutines of one larger script, and then called them using 
some switch logic.
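
That switch logic might look like the sketch below; the parameter name and 
subroutine names are made up for illustration, and parameter parsing is 
reduced to a crude query-string match (a real script would use CGI.pm or 
similar):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical dispatch table: one script, two functions,
# selected by an "action" query parameter.
my %dispatch = (
    view => \&show_posts,
    post => \&submit_post,
);

my $qs = $ENV{QUERY_STRING} || '';
my ($action) = $qs =~ /\baction=(\w+)/;
$action = 'view' unless $action && $dispatch{$action};   # default to viewing
$dispatch{$action}->();

sub show_posts  { print "Content-type: text/plain\n\nshowing posts\n" }
sub submit_post { print "Content-type: text/plain\n\nsubmitting post\n" }
```

A request for script.cgi?action=post would then run submit_post, while 
anything else falls back to show_posts.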

My main question, then, is this.  If a user visits the script, does the 
server only load one instance of the program per visit, or does each 
page change necessitate a new instance to begin?  Because if only one 
process is loaded per visit (for instance, the program is compiled once 
and then used over and over while the user stays within pages covered by 
that program), it makes more sense to combine them into one larger 
program.

However, if the whole program must be loaded by the server each time the 
user visits a new page (the program must be compiled and run for each 
request) then it makes sense to only load what is necessary by splitting 
up the script into several smaller ones.

As may be obvious by now, I do not know very much about how servers 
handle requests for cgi programs, so I'm sorry if this question is posed 
in a nonsensical way.

Thanks,
Adam

-- 
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]

Re: Multiple .cgi scripts vs. one large script

2006-06-15 Thread Bill Stephenson

On Jun 14, 2006, at 6:40 PM, Hardly Armchair wrote:


Hello All,

I was wondering if it is more efficient (in terms of speed and 
processor load) to have two different scripts of approximately the 
same size called to handle two different functions, or to have one 
large script handle all cgi functions using subroutines.  Or perhaps 
these situations are equivalent.


I asked a similar question a few months back ("How big is too big?").

After learning a lot from the responses and where they led me, I started 
looking more closely at CGI::Application.


The general theory I get from this framework (as it applies to your 
question) is that to help with management of subroutines you should 
create scripts (modules) that hold subroutines that perform similar 
tasks. No more than 10 subroutines in a script was the rule of thumb as 
I recall.
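
In CGI::Application terms, that grouping is done with "run modes" mapped to 
subroutines in one module. A minimal sketch (the package name and run-mode 
names are invented, and it assumes the CGI::Application CPAN module is 
installed):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Minimal CGI::Application sketch. Package and run-mode
# names here are invented for illustration.
package BBS;
use base 'CGI::Application';

sub setup {
    my $self = shift;
    $self->start_mode('view');      # run mode used when none is requested
    $self->mode_param('rm');        # ?rm=view or ?rm=post selects a mode
    $self->run_modes(
        view => 'show_posts',
        post => 'submit_post',
    );
}

sub show_posts  { return "showing posts\n" }
sub submit_post { return "submitting post\n" }

package main;
BBS->new->run;
```

Related run modes live together in one subclass, which is the "subroutines 
that perform similar tasks" grouping described above.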


Someone here mentioned that a Perl/CGI script that contains 1000 lines 
is probably about as big as you'd want one to get. The script I'm 
re-factoring to use CGI::Application is now over 10,000 lines (with 
comments). It still performs pretty well, but never sees huge amounts 
of requests.


I completely agree with Ovid's comment, "do NOT worry about performance 
unless you have an extremely good reason to do so." That's one reason 
my script got so big. Performance still is not an issue for me, but 
management is becoming one.


The "One big one versus many small ones" question seems best answered 
by personal preference, up to a point. For me, management was getting 
to be a pain.


Now I'd strongly recommend CGI::Application to anyone working on a 
perl/cgi app that will get bigger than that 1000-line max that was 
previously suggested or needs features easily provided by the framework 
and its plug-ins.


Kindest Regards,

--
Bill Stephenson
417-546-8390


--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]