I am writing a CGI script to retrieve records and split them into pages.

my $limit = int($q->param('limit') || 0);   # row offset from the query string, forced to a number
my $results_per_page = 50;

# Ask MySQL for $results_per_page rows starting at row $limit.
my $query = $dbh->prepare("SELECT * FROM $table WHERE reply=0
                           ORDER BY msg DESC LIMIT $limit, $results_per_page");
$query->execute();
while (my @row = $query->fetchrow_array()) {
    print "$row[1]<br>$row[2]<br>$row[3] blah blah blah.....";
}
$query->finish();
print_number_of_pages();
----------------------------------------------

The table holds about 50,000 rows, and the script is expected to be accessed
100,000 times a day by web users. After reading the manual page for LIMIT, my
understanding is that LIMIT is applied only after the whole set of matching
records has been built, so it is not going to reduce any load on the server;
I am using it only to split the records onto pages.
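
To make the paging concrete, the arithmetic I have in mind is roughly this
(a minimal sketch only; the 'page' parameter and the COUNT(*) query are for
illustration and not the exact code I run):

use POSIX qw(ceil);

my $results_per_page = 50;
my $page   = int($q->param('page') || 1);       # 1-based page number (illustrative parameter)
my $offset = ($page - 1) * $results_per_page;   # first row of that page

# Extra query to work out how many pages there are in total.
my ($total_rows) = $dbh->selectrow_array(
    "SELECT COUNT(*) FROM $table WHERE reply=0");
my $total_pages = ceil($total_rows / $results_per_page);

# The page itself: only $results_per_page rows come back to the script.
my $sth = $dbh->prepare("SELECT * FROM $table WHERE reply=0
                         ORDER BY msg DESC LIMIT $offset, $results_per_page");
$sth->execute();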

----------------------------------------------
My Questions:

1 - Is it safe to load such a huge amount of data 100,000 times a day? Or is
there a way to load only those records which are going to be printed on each
page?

2 - How much data, in terms of size (MB), can safely be held by
my @row = $query->fetchrow_array()?

3 - What else can I do to reduce the load on MySQL, Perl and the server?
Should I use CGI::Fast? (See the sketch after these questions.)

4 - Is there any other possible query I should consider?
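
Regarding question 3, my understanding from the CGI::Fast documentation is
that the script gets wrapped in an accept loop roughly like this, so the Perl
interpreter and the database handle stay loaded between requests (a sketch
only; I have not tried it, and the connect details are placeholders):

use strict;
use CGI::Fast;
use DBI;

# Connect once, outside the request loop, so the handle is reused.
# Database name, user and password below are placeholders.
my $dbh = DBI->connect("dbi:mysql:database=mydb;host=localhost",
                       "user", "password", { RaiseError => 1 });

# Each pass through the loop handles one incoming request.
while (my $q = CGI::Fast->new) {
    my $limit = int($q->param('limit') || 0);
    print $q->header;
    # ... same SELECT / print logic as above ...
}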

Thanks,
Sara.


