Hello,

On my site I paginate query results by limiting the number of rows output
to some value, say LIMIT; the 2nd, 3rd, etc. pages then run the same query
with $skip=LIMIT, $skip=(LIMIT*2), and so on posted back.  I use the
following code to "skip" those result rows, which just fetches each
unwanted row into an unused array:

// if there are rows to skip
if ($result_rows > $rows_to_skip) {
   while ($rows_to_skip > 0) {
      // eat a row
      mysql_fetch_array($result);
      $rows_to_skip--;
      $total_results_shown++;
   }
}
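
I also wondered whether mysql_data_seek() could at least replace the fetch
loop on the PHP side: as I understand it, it just repositions the result
pointer rather than copying each row into an array, although the skipped
rows would still be transferred from the server.  A rough sketch of what I
mean, using the same variable names as above:

```php
// Assumed alternative to the fetch loop: mysql_data_seek()
// repositions the result pointer past the unwanted rows.  The
// rows still come over the wire from MySQL; they are just never
// fetched into a PHP array.
if ($result_rows > $rows_to_skip) {
    mysql_data_seek($result, $rows_to_skip);
    $total_results_shown += $rows_to_skip;
    $rows_to_skip = 0;
}
```
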

Can I make this more efficient?  Is there a way to eliminate these rows
before they leave the MySQL server (and would that be faster)?
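
To be concrete about what I mean by eliminating the rows server-side: I
believe MySQL's LIMIT clause can take an offset as well as a row count, so
the skip could happen in the query itself.  Something like the following
sketch, where the table and column names are made up for illustration:

```php
// Assumed approach: LIMIT offset, count asks MySQL to send only
// the rows for the current page.  $skip is the posted-back value;
// the table/columns here are hypothetical.
$skip  = (int) $_GET['skip'];
$limit = 20; // the per-page LIMIT value

$sql = "SELECT id, title FROM items ORDER BY id LIMIT $skip, $limit";
$result = mysql_query($sql);
while ($row = mysql_fetch_array($result)) {
    // display the row...
}
```
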

If it makes any difference, the average result row is probably around
40-50 bytes, but some of these queries can return up to 850 rows.

Steve
-- 
[EMAIL PROTECTED] ** http://mrclay.org


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
