> Hi!
> 
> Yes, you are near the truth :). But if I use a complicated join query
> across a few large tables, I think that would not be too comfortable for
> the server.
> 
> 

Well, this is the line between mere programming and software
engineering.

The simplest technique is the most straightforward -- trade memory for
time:

my $data = $dbh->selectall_arrayref( q{ select *
                                          from table1
                                          join table2
                                         where complex_function() },
                                     { Slice => {} } );  # one hashref per row


This is one of my All Time Favorite idioms -- it gives you a data
structure of the form:

$data = [ { first_column => 'aubergine' ,
            second_column => 'baklazhan' } ,
          { first_column => 'hello',
            second_column => 'privyet' } ];
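
For example (a minimal sketch, using the toy column names from above),
walking that structure is just a loop over an array of hashrefs:

for my $row ( @$data ) {
    print "$row->{first_column} => $row->{second_column}\n";
}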

Another technique (if your database supports it) is to push a PART of
the load off to the db server:

# Materialize the expensive part once, on the server, into a temp table.
$dbh->do( q{ select into temp table _results ...complex-sql.... } );

# Then stream the precomputed rows back one at a time.
my $sth = $dbh->prepare( q{ select * from _results } );
if ( $sth->execute ) {
    while ( my $row = $sth->fetchrow_hashref ) {
        # ... work with $row here ...
    }
}
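
Depending on the database, the temp table may simply vanish when the
connection closes; if it lingers for the life of the connection, a
minimal cleanup sketch (using the placeholder name _results from above):

$sth->finish;                           # done reading the result set
$dbh->do( q{ drop table _results } );   # explicit cleanup, for databases that
                                        # do not drop temp tables on disconnect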

Of course, this just trades the KNOWN quantity of "straightforward
memory for time" for a relatively unknown tradeoff on the database
server.

-- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- 
        Lawrence Statton - [EMAIL PROTECTED] s/aba/c/g
Computer  software  consists of  only  two  components: ones  and
zeros, in roughly equal proportions.   All that is required is to
sort them into the correct order.
