Oops,

I forgot another strategy I used:
instead of having
testp2(es employee[])

you can use
testp2( names text[], salaries integer[], ages integer[])

This might be the solution requiring the least work, but it is terrible
practice: it makes your record type hard to change (difficult evolution),
and having many columns forces you to create a function with many
arguments, which is often a bad idea.

Cheers,
Rémi-C

2015-06-02 10:36 GMT+02:00 Rémi Cura <remi.c...@gmail.com>:

> Hey,
> the only straightforward workaround I know (which is pretty bad)
> is to cast your record down to text.
> Then you have an array of text, which is manageable.
>
> For this you can either 'flatten' your record into a single text value,
> or cast each part of your record to text and emulate an array of arrays
> (though your function then needs to know the length of the inner array).
>
> I used this to emulate a 2D numpy array of shape (N, 3).
>
> You'll need a custom aggregate, like this one
> <https://github.com/Remi-C/PPPP_utilities/blob/master/postgres/array_of_array.sql>
> .
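Inside the PL/Python function, the text-cast records then have to be parsed back. A minimal sketch of the idea, assuming each record arrives in PostgreSQL's composite text form such as '(Ann,52000,34)' (deliberately simplified: it ignores quoting, escaping and NULLs in the composite format, so it is illustrative only):

```python
def parse_records(texts, fields):
    # texts:  list of composite values cast to text, e.g. '(Ann,52000,34)'
    # fields: the record's field names, which the caller must know
    rows = []
    for t in texts:
        values = t.strip("()").split(",")
        rows.append(dict(zip(fields, values)))
    return rows
```

Note that everything comes back as text; the function also has to know the field types to cast the values onward, which is part of why this workaround is "pretty bad".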
>
> The other, saner solution is to pass information about the rows you
> want to retrieve, and fetch the rows directly within the Python function.
> For instance, here you would pass an array of ids of the employees you
> want to work with.
> This is saner, but as a design choice I don't really like having
> table-specific SQL code inside a generic Python function.
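A sketch of that id-based approach, using the real plpy.prepare/plpy.execute calls available inside a PL/Python function. Outside PostgreSQL there is no plpy module, so a tiny stub stands in for it here purely so the call pattern can be shown; the table and column names are made up:

```python
class _PlpyStub:
    """Stand-in for the plpy module available inside PL/Python functions."""
    def prepare(self, query, argtypes):
        return (query, argtypes)
    def execute(self, plan, args):
        # Pretend the database returned one row (a dict) per matching id.
        (ids,) = args
        return [{"id": i, "name": "employee_%d" % i} for i in ids]

plpy = _PlpyStub()

def process_employees(ids):
    # Body as it would appear inside a real PL/Python function:
    plan = plpy.prepare(
        "SELECT id, name, salary, age FROM employee WHERE id = ANY($1)",
        ["integer[]"])
    rows = plpy.execute(plan, [ids])
    # Each row is a mapping of column name to value.
    return len(rows)
```

This sidesteps the array-of-records limitation entirely, at the cost Rémi notes: the generic Python function now contains table-specific SQL.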
>
> I agree it is cumbersome, and I also badly miss more powerful input
> types for PL/Python functions (after all, PL/Python can already return
> composite types, which is awesome).
>
>
> Cheers,
> Rémi-C
>
> 2015-06-02 2:44 GMT+02:00 Adrian Klaver <adrian.kla...@aklaver.com>:
>
>> On 06/01/2015 07:42 AM, Filipe Pina wrote:
>>
>>> Thanks for the reply anyway; it's a pity though, it'd be useful..
>>>
>>> Another bump I've found along the pl/python road: inserting a ROWTYPE
>>> into a table..
>>> Maybe you have some hint on that? :)
>>>
>>> So, in PLPGSQL I can:
>>>
>>> DECLARE
>>>    my_var my_table;
>>> BEGIN
>>>    my_var.col1 := 'asd';
>>>    INSERT INTO my_table VALUES(my_var.*);
>>> END;
>>>
>>> How would I do something like that in pl/python?
>>>
>>> First, how do I declare a ROW-TYPE variable, given that they're all
>>> Python mappings?
>>>
>>> Is my_var = { 'col1': 'asd' } enough? It would miss all the other
>>> columns...
>>>
>>> Second, how to insert it?
>>>
>>> plpy.prepare and .execute say they don't support composite types, so I
>>> cannot simply pass
>>>
>>> pl = plpy.prepare('INSERT INTO core_customer VALUES ($1)', ['my_table'])
>>>
>>> Any workarounds for this? (meaning I wouldn't have to specify any
>>> columns in the insert statement)
>>>
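One way to avoid hard-coding column names is to build the statement from the mapping's keys at run time. A hypothetical helper sketching the idea (identifier quoting is omitted for brevity; the name build_insert is made up):

```python
def build_insert(table, row):
    # Fix the column order so the values line up with $1, $2, ...
    cols = sorted(row)
    placeholders = ", ".join("$%d" % (i + 1) for i in range(len(cols)))
    sql = "INSERT INTO %s (%s) VALUES (%s)" % (
        table, ", ".join(cols), placeholders)
    return sql, [row[c] for c in cols]
```

The resulting (sql, args) pair could then be fed to plpy.prepare and plpy.execute, together with a matching list of parameter types.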
>>
>> http://www.postgresql.org/docs/9.4/interactive/sql-insert.html
>>
>> pl = plpy.prepare('INSERT INTO core_table SELECT * FROM my_table')
>>
>>
>>> Thanks
>>>
>>> On Fri, May 29, 2015 at 2:00 , Peter Eisentraut <pete...@gmx.net> wrote:
>>>
>>>> On 5/18/15 10:52 AM, Filipe Pina wrote:
>>>>
>>>>     But one of the functions I need to create needs to accept an array
>>>>     of records.
>>>>
>>>> PL/Python doesn't support that. Some more code needs to be written to
>>>> support that. You did everything correctly. I don't know of a good
>>>> workaround.
>>>>
>>>
>>
>> --
>> Adrian Klaver
>> adrian.kla...@aklaver.com
>>
>>
>>
>> --
>> Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
>> To make changes to your subscription:
>> http://www.postgresql.org/mailpref/pgsql-general
>>
>
>
