Uri Guttman <u...@stemsystems.com> writes:

> On 05/13/2017 09:08 PM, lee wrote:
>> Hi,
>>
>> would you say this acceptable or something that should be forbidden?
>>
>>
>> my $sth_cmds = $dbh->prepare_cached($sql);
>> my @params;
>> push(@params, undef) foreach(0..12);
>> $sth_cmds->execute();
>> $sth_cmds->bind_columns(eval join(', ', map('\$params[' . $_ . ']', 0..$#params)));
>>
>>
>> I haven't used 'eval' before, and this seems inherently dangerous.
>
> you are correct in that string eval is very dangerous. my rule is you
> never use it until you learn when not to use it! :)

How do you learn that unless you use it?
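
For what it's worth, the string eval doesn't seem necessary here: if I
read the DBI documentation correctly, bind_columns() takes a list of
references to scalars, and \(@array) produces exactly such a list, one
reference per element.  A sketch of the eval-free variant, reusing $dbh
and $sql from the snippet above (13 slots to match 0..12):

my $sth_cmds = $dbh->prepare_cached($sql);

# one slot per selected column, matching 0..12 above
my @params = (undef) x 13;

$sth_cmds->execute();

# \(@params) yields a list of references, one to each element of
# @params, so every fetch() fills @params in place -- no string eval
$sth_cmds->bind_columns( \(@params) );

while ( $sth_cmds->fetch ) {
    # @params now holds the values of the current row
}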

> you likely want to use the dbi calls that return a hash for each row
> you select. i always forget the name but it is something like
> fetchall_hashrows. you can look it up easily. it is generally the
> nicest way to get rows from dbi IMO.

That depends on what you want to do: IIRC, I was using this to prepare
data for Chart::Bars.  That requires (references to) arrays containing
the data you want to have plotted in the chart, each array representing
one set of data.  Since the data is in columns in the database, you need
to transform the columns into suitable rows:


database:
val1a, val2a, val3a, ...
val1b, val2b, val3b, ...
...


Chart::Bars:
val1a, val1b, val1c, ...
val2a, val2b, val2c, ...
...


That isn't too clear in the documentation and can be rather confusing
until you get the idea.
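
Sketched with a made-up two-column query (table and column names are
placeholders, and I'm going from memory as far as the Chart::Bars calls
are concerned), the transformation looks roughly like this:

my $sth = $dbh->prepare('SELECT label, value FROM stats ORDER BY label');
$sth->execute();

my ( @labels, @values );
while ( my $row = $sth->fetchrow_arrayref ) {
    # each fetched row contributes one element to every data array,
    # which is the column-to-row transformation shown above
    push @labels, $row->[0];
    push @values, $row->[1];
}

my $chart = Chart::Bars->new( 600, 400 );
# Chart::Bars wants one array(ref) per data set: the first one is the
# x-axis labels, the following ones are the data series
$chart->png( 'chart.png', [ \@labels, \@values ] );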

You would need something like a "multi hash" for this, i.e. a hash in
which each key refers to multiple elements forming a row (like an
array), rather than the usual one-to-one hash in which each key refers
to a single value (such as a reference to such an array).  That still
leaves you with the transformation to do, and I couldn't see any
advantage to using hashes.
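
If one really wanted the hash variant, it would come out as a hash of
array references, roughly like this (with the same kind of statement
handle as in the sketch above):

my %columns;    # each key maps to a reference to an array of values
while ( my $row = $sth->fetchrow_hashref ) {
    # append every column value to the array kept under its column name
    push @{ $columns{$_} }, $row->{$_} for keys %$row;
}
# $columns{label} and $columns{value} are then references to "rows" in
# the Chart::Bars sense -- but the transformation still has to happen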


The fetchall_something functions return, IIRC, a reference to an array
containing references to arrays of the values you want to fetch.  That
always gives me trouble with the dereferencing needed to finally get at
the values.
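
For reference, the dereferencing I mean looks something like this (a
sketch using fetchall_arrayref, which I believe is the actual name):

my $rows = $sth->fetchall_arrayref;   # ref to an array of array refs

for my $row (@$rows) {                # @$rows gives the outer array
    my $first_column = $row->[0];     # $row is itself an array reference
    my @columns      = @$row;         # or copy one row's values at once
}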

They are potentially dangerous because you might fetch huge numbers of
rows and run into memory issues that could bring down the server.  I
avoid them unless they offer a significant advantage and I can be sure
not to fetch too many rows, or unless I would have all the data in the
program anyway.
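
If I read the DBI documentation right, the optional $max_rows argument
of fetchall_arrayref() can at least cap how much sits in memory at
once, so a batched sketch would be:

$sth->execute();
while ( my $batch = $sth->fetchall_arrayref( undef, 1_000 ) ) {
    last unless @$batch;    # an empty batch means no rows are left
    for my $row (@$batch) {
        # at most 1000 rows are held in memory at any one time
    }
}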

Wherever possible, I process one row after the other, and I try to do
as much as possible with queries beforehand.  Being able to do that is
one of the nice things about using a database :)

You only need to be careful with some queries: it can be tremendously
faster to use (temporary) tables and several queries, doing things in
several steps, rather than one convoluted query that does it all in a
single step.
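
A sketch of what I mean, with made-up table and column names, and SQL
that may need adjusting for the particular database:

# step 1: reduce the data into a temporary table first
$dbh->do(q{
    CREATE TEMPORARY TABLE tmp_daily AS
    SELECT day, count(*) AS n
    FROM   events
    GROUP  BY day
});

# step 2: a second, simple query against the reduced data, instead of
# one convoluted query that tries to do everything in a single step
my $sth = $dbh->prepare('SELECT day, n FROM tmp_daily ORDER BY day');
$sth->execute();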


-- 
"Didn't work" is an error.
