After complaints that my CGI takes too long, I traced it down to the
redirect() function.
Why does it take so long?
Is there an alternative?
Jonathan
On Fri, 12 Jul 2002 16:46:48 +0300, [EMAIL PROTECTED] (Octavian Rasnita)
wrote:
>Hi all,
>
>Please tell me if what I want is possible or not.
>
>I want to make a Perl script that will prompt the visitor to download multiple
>files, not just one (for example, three files).
>Of course, after selecting them,
A common mistake is to print a header and then do a redirect, which slows
the process down because essentially you have a script and a page competing
to generate a header. When using redirect(), do not do this:
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;
print $q->header;                           # wrong: a header has already been sent...
print $q->redirect('http://example.com/');  # ...so this redirect is broken (placeholder URL)
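Instead, call redirect() by itself and let it emit the complete header. A minimal sketch (the target URL is a placeholder):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;

# redirect() emits its own complete header (a Status: 302 line plus a
# Location: line), so print nothing else before or alongside it.
print $q->redirect('http://example.com/next-page');  # placeholder URL
```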
If I print the header() first, the output of redirect() ends up printed to
the browser as page content. So this is not what I have been doing.
But I think I might have a clue, tell me if I am right.
The CGI is being called from an HTML form that uses POST.
After processing the info, I use redirect().
Do you think the client's browser goes and POSTs the form again to the new
URL?
No, I don't think that is what's happening.
Once you POST the info, the script processes
it and does the redirect, but I don't think it is
posting the entire string of name/value pairs again.
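If a re-POST were the worry, one way to rule it out is to redirect with an explicit 303 See Other status, which instructs the browser to follow the redirect with GET. A sketch assuming CGI.pm's -status argument to redirect() (the URL is a placeholder):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;

# A 303 response tells the browser to fetch the new URL with GET,
# so the form data is never re-POSTed. Placeholder URL below.
print $q->redirect(
    -uri    => 'http://example.com/done',
    -status => '303 See Other',
);
```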
On Sun, 14 Jul 2002 13:04:49 -0400, [EMAIL PROTECTED] (Zentara)
wrote:
>I don't think you can do this easily. The HTTP protocol is set up so that
Forget that, I got industrious this afternoon, and looked into how to do
it. Here it is. It works great: it sends one file right after the other.
download-
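The script itself did not make it into this message, but the effect described (one file sent right after another in a single response) can be sketched with a multipart/x-mixed-replace body. This is an assumption about the approach, the file names are hypothetical, and browser support for triggering multiple downloads this way varies:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @files    = ('a.txt', 'b.txt', 'c.txt');  # hypothetical file names
my $boundary = 'FILE_BOUNDARY';

# One multipart response; each part carries its own attachment header.
print "Content-Type: multipart/x-mixed-replace;boundary=$boundary\r\n\r\n";
for my $file (@files) {
    print "--$boundary\r\n";
    print "Content-Type: application/octet-stream\r\n";
    print qq{Content-Disposition: attachment; filename="$file"\r\n\r\n};
    # In a real script, open and stream the file's contents here.
    print "(contents of $file)\r\n";
}
print "--$boundary--\r\n";
```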
Hello All,
I am new to this list, so if this is not the proper list to send this to, I
would appreciate the name of the appropriate list.
I need to run through various folders to check the disk usage within each
folder. I know I could use the command `du -sh /path/to/folder`, but I was
wondering whether Perl has an equivalent.
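Perl's core File::Find module can total file sizes much like `du -s`. A minimal sketch (the starting directory is a placeholder argument, defaulting to the current directory):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my $dir   = shift @ARGV || '.';  # placeholder: pass a directory on the command line
my $total = 0;

# Walk the tree, adding up the size of every plain file.
find(sub { $total += -s $_ if -f $_ }, $dir);

printf "%s: %.1f KB\n", $dir, $total / 1024;
```

Note that, unlike `du`, this counts file sizes rather than allocated disk blocks, so the totals can differ slightly.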