[GENERAL] Advice on structure / sequence / trigger

2005-06-16 Thread David Pratt
I am interested in adding some internationalization capability to an 
application.  i18n will take care of the interface but not the data. To 
internationalize the data, I want to put the internationalized text 
strings in a multi_language table like this:


CREATE TABLE multi_language (
    id            SERIAL,
    ml_id         INTEGER NOT NULL,
    language_id   INTEGER NOT NULL,
    language_text TEXT NOT NULL
);

Other tables would be joined by ml_id (the multi-language id).  
For example:


CREATE TABLE example_table (
    id         SERIAL,
    name_ml_id INTEGER NOT NULL
);

So in example_table, name_ml_id joins to ml_id, and the same ml_id appears 
in the multi_language table once for each language. There would be two 
records in multi_language for a record in example_table if you had an 
English translation and a French translation.


I want to add records to multi_language sequentially.  So let's say I 
add a new example in example_table: I want to see what the last ml_id 
value added to multi_language was, so that the new record uses the next 
value in the sequence.  As you can see from the structure, the id field 
is serial and does this automatically, but I am speaking of the ml_id 
field specifically.  Let's say I have one example record in 
example_table; multi_language would then look like this:


1, 1, 1, the brown cow    # English translation of name - language 1 (en)
2, 1, 2, la vache brun    # French translation of name - language 2 (fr)


ml_id for both records is 1.

So when I create a second record in example_table, I want to have this:

1, 1, 1, the brown cow    # English translation of name (first record) - language 1 (en)
2, 1, 2, la vache brun    # French translation of name (first record) - language 2 (fr)
3, 2, 1, the blue turkey  # English translation of name (second record) - language 1 (en)
4, 2, 2, la dandon bleu   # French translation of name (second record) - language 2 (fr)


How best to do this? Would I create a separate sequence for the 
multi_language ml_id and do a select on it to get the next value before 
inserting each multi_language record?  Should this be done using a 
trigger - and if so, how? Should this be done in my application code 
rather than in SQL, or would that be dangerous?  For example, the 
multi_language table will be used a lot. What if a couple of people were 
creating new records at the same time? If I were using Python and doing 
this in my application code, I am wondering if there could be problems. 
With a trigger it would be transactional, correct? Can you have a trigger 
work from incrementing a sequence instead of updating a table?
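
To make the sequence idea concrete, this is roughly what I am picturing 
(the sequence name ml_id_seq is just an example, not something I have 
created yet):

CREATE SEQUENCE ml_id_seq;

BEGIN;
-- draw one ml_id and reuse it for every translation of the same string;
-- nextval/currval are tracked per session, so two people inserting at
-- the same time cannot see each other's values
INSERT INTO multi_language (ml_id, language_id, language_text)
    VALUES (nextval('ml_id_seq'), 1, 'the brown cow');
INSERT INTO multi_language (ml_id, language_id, language_text)
    VALUES (currval('ml_id_seq'), 2, 'la vache brun');
INSERT INTO example_table (name_ml_id)
    VALUES (currval('ml_id_seq'));
COMMIT;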


I just want to get this right because it will be an important part of 
what I am preparing. Sorry for the really long message, but I don't know 
if it would make any sense if I did not fully explain what I am trying 
to do. I am not French, so excuse my sample translations...




Re: [GENERAL] Advice on structure / sequence / trigger

2005-06-30 Thread David Pratt
Hi Greg.  Sorry for getting back to you so late on this. I think your 
idea on the design is spot on, since it will give me referential 
integrity with my other tables, and the multi-language table will just be 
a simple two-field table with an id and a multi-dimensional array mapping 
language codes to strings.  Super idea!  Even if the array gets larger it 
is not a big issue, since postgres can easily handle it.


Regards,
David

On Friday, June 17, 2005, at 12:15 AM, Greg Stark wrote:



David Pratt <[EMAIL PROTECTED]> writes:

I just want to get this right because it will be an important part of 
what I am preparing. Sorry for the really long message, but I don't know 
if it would make any sense if I did not fully explain what I am trying 
to do. I am not French, so excuse my sample translations...


FWIW I started with a design much like this. I threw it out when I 
started having to actually use it and found it just entirely unworkable. 
It was bad enough that every single query had to do a subquery for every 
single text field, but the first time I actually had to *load* the data I 
realized just how much work every single insert, delete, and update was 
going to be...

I ended up storing every text field as a text[] and assigning each 
language an index into the array. This only works because in my 
application everything will have precisely the same (small) set of 
languages available. If you have a large variety of languages and each 
string will be available in a varying subset, then this model might not 
work as well. It did require a bit of extra work handling arrays, since 
my language driver doesn't handle them directly.


I can't make a principled argument for this being the "right" model, but 
it's working well in practice for me and I can't see the fully normalized 
model ever working out well. One thing that worries me is that neither 
the array feature set nor the i18n feature set is stable yet, and future 
changes might break my code. But I think it'll be workable.

--
greg





Re: [GENERAL] Transparent i18n?

2005-06-30 Thread David Pratt
Hi Steve.  I have been puzzling a bit over a similar issue - not i18n 
for the interface but for text data - and trying to sort out a solution, 
so I will be interested to hear additional advice as well.  When I wrote 
to the list a couple of weeks back (look for my posting around the 17th) 
I was looking at doing something with a normalized structure, but now I 
don't think that is going to work that well.  It was suggested that I 
look at an array, so I am looking at a multidimensional array to do this. 
I am just reading up on postgres support for arrays.


I think my table will be pretty simple:
CREATE TABLE multi_language (
    id                 SERIAL,
    lang_code_and_text TEXT[][]
);

So records would look like:

1, {{'en','the brown cow'},{'fr','la vache brun'}}
2, {{'en','the blue turkey'},{'fr','la dandon bleu'}}

I have another table with language codes, i.e. en, fr, etc.  When 
languages are added, I would just append to the array across the whole 
table.  The trouble for me is more about getting the data out in 
postgres, because the raw array comes back in a syntax that is 
incompatible with Python and I would have to manipulate the results. 
Quite frankly I want this to be done in Postgres so I only have to 
retrieve query results.  If I can't, it will be a pain unless I can think 
of something else, because the issue is going to be keeping the keys and 
values in my languages table working with the array.


For example, if I have a serial table containing my languages and add two 
entries, English and French, I would then have two elements in my array, 
and it wouldn't be so bad because I could use the id as a key to get the 
value back out through a query.  But say I delete French (and remove the 
second element from the entries in my table) and add Spanish: now I have 
a language id of 3 and only two elements in my array, so the ids and 
array positions no longer line up.  In Python, associative arrays are 
called dictionaries, and you can easily grab an element by doing 
something like lang_code_and_text['en'] to get the value of the en 
(English) key.


I was hoping you could pull the multi-language text out of the array 
with a text key instead of a numeric index, but it appears Postgres will 
only let you do it this way, or get results from slices, as far as I can 
tell.  Maybe someone else on the list has some advice to offer here.

i.e.

SELECT language_text[1][1] AS language_code,
       language_text[1][2] AS text
FROM language_text;
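
The closest I can come up with myself is a small plpgsql helper that 
scans the inner arrays for a given code (just a sketch - the function 
name is made up and it is untested):

CREATE FUNCTION get_translation(text[][], text) RETURNS text AS '
DECLARE
    arr  ALIAS FOR $1;   -- a lang_code_and_text value
    code ALIAS FOR $2;   -- iso code to look up
    i    int;
BEGIN
    FOR i IN array_lower(arr, 1) .. array_upper(arr, 1) LOOP
        IF arr[i][1] = code THEN
            RETURN arr[i][2];
        END IF;
    END LOOP;
    RETURN NULL;
END;
' LANGUAGE 'plpgsql';

-- e.g. SELECT get_translation(lang_code_and_text, 'fr') FROM multi_language WHERE id = 1;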


Regards,
David



Re: [GENERAL] Transparent i18n?

2005-07-02 Thread David Pratt

Many thanks Karsten for some insight into how you are handling this.

Regards,
David


On Saturday, July 2, 2005, at 06:08 AM, Karsten Hilbert wrote:


SELECT language_text[1][1] AS language_code,
       language_text[1][2] AS text
FROM language_text;


The way we do that in GNUmed:

 select lookup_val, _(lookup_val) from lookup_table where ...;

If you want to know how, see here:

  http://savannah.gnu.org/cgi-bin/viewcvs/gnumed/gnumed/gnumed/server/sql/gmI18N.sql?rev=1.20&content-type=text/vnd.viewcvs-markup
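
Very roughly, the idea is a translations table plus a per-user language 
setting, with _() doing the lookup. A simplified illustration only (these 
are made-up names, not our actual schema - the real definitions are in 
the file above):

-- simplified illustration; see gmI18N.sql above for the real thing
CREATE TABLE user_language (
    db_user text PRIMARY KEY,
    lang    text NOT NULL
);

CREATE TABLE translations (
    lang  text NOT NULL,
    orig  text NOT NULL,
    trans text NOT NULL,
    PRIMARY KEY (lang, orig)
);

CREATE FUNCTION _(text) RETURNS text AS '
DECLARE
    original   ALIAS FOR $1;
    translated text;
BEGIN
    SELECT INTO translated t.trans
        FROM translations t, user_language u
        WHERE u.db_user = CURRENT_USER
          AND t.lang = u.lang
          AND t.orig = original;
    IF translated IS NULL THEN
        RETURN original;   -- fall back to the untranslated string
    END IF;
    RETURN translated;
END;
' LANGUAGE 'plpgsql';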


Feel free to ask for clarification.

Karsten
--
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346



Re: [GENERAL] Transparent i18n?

2005-07-02 Thread David Pratt
Hi Greg.  Well, I'm kind of halfway there, but I think what I am doing 
could work out.


I have an iso_languages table, a languages table for the languages in 
use, and a multi_language table for storing the values of my text fields. 
I choose my languages from iso_languages. Any table that needs a 
multi_language field gets one by id, with referential integrity against 
the multi_language table id, since this is a direct relationship.  Thanks 
for the idea of using an array BTW; referential integrity could not work 
with my first model.


I am taking the array text and parsing the result in Python to get the 
key positions.  This is possible with a query using the string_array 
function to get the text from any multi_language field. Then I put the 
result into a dictionary and take its length plus one to get the new key 
value that is used when a new language is added. Using this key, an array 
is appended to the existing array in the lang_code_and_text field of each 
row of the multi_language table. So the main array in the 
multi-dimensional array grows by one inner array for the new language, 
for each record in the multi_language table.

I can also look up the English (en) value so that I can use English as 
the default text for the new language when inserting a new array for that 
language into the lang_code_and_text array.  For example, if Spanish (es) 
is added, the new key is 3, so after inserting it for each record I have 
something like this now:


1, {{'en','the brown cow'},{'fr','la vache brun'},{'es','the brown cow'}}
2, {{'en','the blue turkey'},{'fr','la dandon bleu'},{'es','the blue turkey'}}


In my forms, I am using a template to display entry fields for each 
language used. The English will be the default for newly added languages, 
so there is something in these fields to start with, and it should update 
properly based on the correct key values. In my languages table, I am 
storing the current key position for each language used in my app. I have 
an i18n layer for Zope, and based on the language code I will pass the 
language id, so you see the right language in both the interface and the 
data.

When updating or deleting language records, I will be making a trigger to 
remove the array that represents a translation after the update. Then it 
has to update my languages table to provide the updated key values for my 
languages. I am working on my first functions and triggers in plpgsql.  
This is where I may need help from the list if I get stuck, but so far so 
good!

Well, so far so good, but it is not finished yet. Does anyone have any 
comments on scalability?  I don't really see a problem, since there 
really is no risk of my needing any more than 10 - 15 languages or so, 
max, out of the maybe 300 languages in the world.  I think 15 entries in 
an array is very small, so I can't see any reason for this not to work 
well.




I think my table will be pretty simple:
CREATE TABLE multi_language (
    id                 SERIAL,
    lang_code_and_text TEXT[][]
);

So records would look like:

1, {{'en','the brown cow'},{'fr','la vache brun'}}
2, {{'en','the blue turkey'},{'fr','la dandon bleu'}}


That's a lot more complicated than my model.

Postgres doesn't have any functions for handling arrays like these as 
associative arrays like you might want. And as you've discovered it's not 
so easy to ship the whole array to your client where it might be easier 
to work with.



Yes. This is a bit of a complication, since if they were there it would 
be really nice to work with arrays.



I just have things like (hypothetically):

CREATE TABLE states (
  abbrev        text,
  state_name    text[],
  state_capitol text[]
)

And then in my application code data layer I mark all "internationalized 
columns", and the object that handles creating the actual select 
automatically includes a "[$lang_id]" after every column in that list.
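
With English mapped to index 1, for example, the generated query comes 
out looking something like this (column names from the hypothetical table 
above):

SELECT abbrev,
       state_name[1]    AS state_name,
       state_capitol[1] AS state_capitol
FROM states;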

The list of languages supported and the mapping of languages to array 
positions is fixed. I can grow it later but I can't reorganize them. This 
is fine for me since pretty much everything has exactly two languages.


That is pretty cool.  The only advantage of what I am doing is that you 
will be able to add languages at any time, and there will be no huge load 
on postgres as far as I can tell, since the multi_language table has only 
two fields and one record for each multi-language field referenced from 
my other tables, and calls to it are direct by id.  I think this should 
work, but it is a puzzler for sure!

Regards,
David



[GENERAL] Trigger help

2005-07-03 Thread David Pratt
Basically I want this trigger to fire after a language record is added to 
my languages table.


CREATE TRIGGER language_add_trig AFTER INSERT ON languages
FOR EACH ROW EXECUTE PROCEDURE trigger_language_add();

Here is my function, but it is not working.  I want to loop over each 
record in my multi_language table, grab the English text from each 
record, and append an array to the multi-dimensional array in the 
lang_code_and_text field of the same record.  I am new to triggers, so 
help debugging or correcting my syntax is appreciated.

Thanks,
David

CREATE FUNCTION trigger_language_add() RETURNS opaque AS '
DECLARE

    r            record;          -- loop record
    en_key       int;             -- holds english key value
    default_text text;            -- holds english text value
    iso_en       text := ''en'';  -- holds value of en


BEGIN
-- Sets english language key
SELECT INTO en_key
ml_key
FROM languages
WHERE iso_id = iso_en;

FOR r in SELECT * from multi_language LOOP

-- Sets default_text to english text value for record
SELECT INTO default_text
lang_code_and_text[en_key][2]
FROM multi_language
WHERE id = r.id;

        RAISE NOTICE ''Current record id is %.'', r.id;
        RAISE NOTICE ''Default english text is %.'', default_text;

        -- Appends new language array to existing multidimensional array
        -- New language array of form {''fr'',''Default english text here''}

        UPDATE multi_language
        SET lang_code_and_text = r.lang_code_and_text || ARRAY[new.iso_id, default_text]
        WHERE id = r.id;

END LOOP;
END;
' LANGUAGE 'plpgsql';



Re: [GENERAL] Transparent i18n?

2005-07-04 Thread David Pratt

Many thanks, Karsten.  I am going to look at your example closely.

Regards
David

On Sunday, July 3, 2005, at 09:50 AM, Karsten Hilbert wrote:


On Sat, Jul 02, 2005 at 05:00:50PM -0300, David Pratt wrote:

http://savannah.gnu.org/cgi-bin/viewcvs/gnumed/gnumed/gnumed/server/sql/gmI18N.sql?rev=1.20&content-type=text/vnd.viewcvs-markup

Many thanks Karsten for some insight into how you are handling this.

David,

if you go to the Developers Corner in our Wiki at

 http://salaam.homeunix.com/twiki/bin/view/Gnumed/WebHome

you'll find an explanation of how we use this. Feel free to
ask for comments if that doesn't suffice.

(I am offline so can't give the precise URL.)

Karsten
--
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346



Re: [GENERAL] Trigger help Solved

2005-07-04 Thread David Pratt

I figured out my trigger trouble:

SET lang_code_and_text = r.lang_code_and_text || ARRAY[new.iso_id, default_text]


The above was not casting the ARRAY[] constructor as text[], so it would 
not concatenate with the existing array.  I had to set a variable to cast 
the type and then concatenate that to the original, and I also had to 
return NULL since this is an AFTER trigger.
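
For anyone hitting the same thing in the archives, the working version 
ended up looking roughly like this (a sketch rather than a verbatim copy, 
written with RETURNS trigger instead of opaque):

CREATE FUNCTION trigger_language_add() RETURNS trigger AS '
DECLARE
    r            record;
    en_key       int;     -- position of the english key
    default_text text;    -- english text used as the default translation
    new_entry    text[];  -- the {iso_code, text} pair to append
BEGIN
    SELECT INTO en_key ml_key FROM languages WHERE iso_id = ''en'';

    FOR r IN SELECT * FROM multi_language LOOP
        SELECT INTO default_text lang_code_and_text[en_key][2]
            FROM multi_language WHERE id = r.id;

        -- cast explicitly so the concatenation sees a text[] value
        new_entry := ARRAY[new.iso_id, default_text]::text[];

        UPDATE multi_language
            SET lang_code_and_text = lang_code_and_text || new_entry
            WHERE id = r.id;
    END LOOP;

    RETURN NULL;  -- an AFTER trigger must still return something
END;
' LANGUAGE 'plpgsql';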


Regards,
David



Re: [GENERAL] Transparent i18n?

2005-07-04 Thread David Pratt
Hi Greg. Not sure about this one, since I have never made my own type.  
Do you mean like an IP-to-country type of situation to guess the locale?  
If so, I am using an IP-to-country table to look up the IP from the 
request and get the country, so the language can be passed automatically 
to display the proper language (but I need some translation work done 
first before I can activate this).  I will also use this for blacklisting 
purposes and other things, so it is multi-purpose.


I have got a good part of what I wanted working so far.  I am just 
working on the language update/delete trigger, since there does not 
appear to be a direct way of surgically removing a specific element from 
an array in postgres, unless I have missed something.  For example, if I 
knew Spanish was the 3rd array in my multi-dimensional array of, say, 10 
lang/translation arrays containing all the translations - there is no way 
to remove just that one without rewriting the array and updating the 
field (which is what I am hoping to complete today).
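
To give an idea of the rewrite involved, the helper below is roughly what 
I have in mind (a rough, untested sketch - the function name is made up):

CREATE FUNCTION remove_translation(text[][], text) RETURNS text[][] AS '
DECLARE
    arr    ALIAS FOR $1;  -- existing lang_code_and_text value
    code   ALIAS FOR $2;  -- iso code of the language being dropped
    result text[][];
    i      int;
BEGIN
    FOR i IN array_lower(arr, 1) .. array_upper(arr, 1) LOOP
        IF arr[i][1] <> code THEN
            IF result IS NULL THEN
                result := ARRAY[ARRAY[arr[i][1], arr[i][2]]];
            ELSE
                result := result || ARRAY[arr[i][1], arr[i][2]];
            END IF;
        END IF;
    END LOOP;
    RETURN result;
END;
' LANGUAGE 'plpgsql';

-- e.g. UPDATE multi_language SET lang_code_and_text = remove_translation(lang_code_and_text, 'es');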


So my language update/delete trigger needs to scan the array for the 
lang/translation to delete, update the language key for each language 
from a reference field (other than for the language being deleted), 
rewrite the array without the lang/translation that was deleted, and then 
update the field with the rewritten array.  It sounds worse than it 
really is, since the multidimensional array containing each 
lang/translation array is the same length everywhere, and you perform 
this by iterating with a loop through the records in the multi_language 
table. Further, each translation can be compared by key (for me this is 
the iso language code).  Also, realistically, how many times do you need 
to add and drop languages?  And the number of languages in use for me 
will likely never exceed, say, 20. So this process, even with large 
numbers of multi-language fields, should not be that problematic, even if 
you had say a few thousand text fields you wanted translations available 
for. I think you would still be looking at milliseconds to perform this. 
This will be an AFTER type trigger (after deletion).  I guess I will see 
what performance is like when I am finished - so far it is pretty fast 
for adding.


You also get a sensible structure for multi_language fields, where each 
one references the multi_language table by id (normalized) with 
referential integrity (something I was seeking).  The only thing not 
normalized is the translations, which is okay by me since the array 
structure is dynamic yet the keys give you exactly what you want.  I am 
also going to look at Karsten's material shortly to see how his system 
works, but I am interested in following through with the arrays approach 
I started first, since I am happy with what I am seeing.


Regards,
David

On Monday, July 4, 2005, at 12:06 PM, Greg Stark wrote:



I wonder if you could make an SQL type that used text[] as its storage 
format but had an output function that displayed the correct text for the 
"current locale". Where "current locale" could be something you set by 
calling a function at the beginning of the transaction.

Do pg_dump and all the important things use the send/receive functions 
not the input/output functions? So even though this output function loses 
information it wouldn't cause serious problems?

You would still need a way to retrieve all the languages for cases like 
administrative interfaces for updating the information. I'm not entirely 
convinced this would be any better than the alternative of retrieving all 
of them by default and having a function to retrieve only the correct 
language.


--
greg




[GENERAL] Regex escape [ character and change text result into integer

2005-07-05 Thread David Pratt
Hi.  I am using array_dims to give me the dimensions of a 
multidimensional array, i.e.:


[1:5][1:2]

In my function I want to retrieve the value of the second number from 
array_dims (5 in the example above).


This is what I am trying:

count_str = substr(dimensions from '\[\d\:(\d+)\]\[\d\:\d\]')
(this should give me 5 in the above example)

I have two problems:

1) I can't seem to escape the [ character - so how do I do this?
2) How do I turn my result, which is originally text, into an integer?

Regards,
David



Re: [GENERAL] Regex escape [ character and change text result into integer

2005-07-05 Thread David Pratt
Hi Tom.  I misread the manual.  I thought I could not use array_upper on 
a multidimensional array, but that note was about specific concatenation 
functions.  Thank you for clarifying this.
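
For the archives, the array_upper route looks like this against the 
multi_language table from earlier (and a plain cast covers the 
text-to-integer part):

SELECT array_upper(lang_code_and_text, 1) AS language_count
FROM multi_language;    -- gives 5 for a [1:5][1:2] array

-- and if a text value ever does need to become an integer, a cast does it:
SELECT '5'::int;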


Regards,
David

On Tuesday, July 5, 2005, at 01:22 PM, Tom Lane wrote:


David Pratt <[EMAIL PROTECTED]> writes:

Hi.  I am using array_dims to give me the dimensions of a 
multidimensional array, i.e.:

[1:5][1:2]

In my function I want to retrieve the value of the second number from 
array_dims (5 in the example above).


Why aren't you using array_upper()?


This is what I am trying:
count_str = substr(dimensions from '\[\d\:(\d+)\]\[\d\:\d\]')


Perhaps you forgot to double the backslashes?

regards, tom lane



Re: [GENERAL] Transparent i18n?

2005-07-07 Thread David Pratt
Many thanks Karsten. I got a system working with arrays yesterday, but I 
will still be examining your code. I guess the next challenge is to see 
how well the multidimensional array can be searched.  I could probably 
make indexes on an expression to retrieve the language for a specific 
key, since each element array of the multidimensional array is a 
translation that includes the iso code and the text of the translation.
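
For instance, if English always sits at key position 1, an expression 
index along these lines should be possible (untested; the index name is 
made up):

CREATE INDEX multi_language_en_text_idx
    ON multi_language ((lang_code_and_text[1][2]));

-- a query of the same shape can then make use of it:
-- SELECT id FROM multi_language WHERE lang_code_and_text[1][2] = 'the brown cow';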


It is pretty light and quick.  I am open to examining anything that 
will help me learn more about doing this well.


Regards,
David.


On Wednesday, July 6, 2005, at 11:19 AM, Karsten Hilbert wrote:


On Mon, Jul 04, 2005 at 03:27:59PM -0300, David Pratt wrote:

I am also going to look at Karsten's material shortly to see how his 
system works

I am still away from the net, but here is how to find the description in 
our Wiki:

Go to user support, user guide, scroll down to the developers guide, go 
to backend I18N.

Please point out anything you find difficult to figure out.

Karsten
--
GPG key ID E4071346 @ wwwkeys.pgp.net
E167 67FD A291 2BEA 73BD  4537 78B9 A9F9 E407 1346



[GENERAL] Update more than one table

2005-07-09 Thread David Pratt
Hi. I have a form that collects information from the user, but then I 
need to update three separate tables from what the user has submitted. I 
could do this with application logic, but I feel it would be best handled 
in Postgres as a transaction.


I need to do things in this order to satisfy the foreign key constraints:


1.   Insert part of the data into 2 records of the first table (I need 
to return these ids so they are available for the next insert).


2.   Insert part of the data into a record in a second table.  The ids 
created in 1 need to be part of this record (they cannot be null values) 
and also have referential integrity with the first table.


3.   Insert the last part of the data into a record in a third table.  
The id created in 2 needs to be part of this record. This has referential 
integrity with the second table.


Can someone suggest the best way of handling this?  Triggers are out 
since each update requires different fields.  I am thinking the only way 
to do this is a function. So the biggest question is how to return the 
ids created by the first insert (so they can be used by the second), and 
then how to get the id generated by the second insert (so it can be used 
by the third). The ids for each table are serial type, so they each have 
a sequence associated with them.  Would you nest functions?  Create each 
separately and wrap them within one function? Even with this, I am 
thinking the most important part is: how do I return the id of the record 
I just inserted into a table?


Many thanks
David




Re: [GENERAL] Update more than one table

2005-07-10 Thread David Pratt
Hi Roman.  Many thanks for your reply.  This is interesting and I will 
give it a try and let you know how it works out. You are right, the 
application logic and the transaction don't have to be separate, which 
would be nice for this.  I was thinking the only way to solve it was a 
function that performed an update and returned the nextval at the same 
time, so that I could use that value to perform the update on the next 
table, etc.


Regards,
David

On Sunday, July 10, 2005, at 02:32 PM, Roman Neuhauser wrote:


# [EMAIL PROTECTED] / 2005-07-09 22:55:26 -0300:

Hi. I have a form that collects information from the user, but then I 
need to update three separate tables from what the user has submitted. I 
could do this with application logic, but I feel it would be best handled 
in Postgres as a transaction.


Those two don't conflict.


I need to do things in this order to satisfy the foreign key constraints:

1.   Insert part of the data into 2 records of the first table (I need 
to return these ids so they are available for the next insert).

2.   Insert part of the data into a record in a second table.  The ids 
created in 1 need to be part of this record (they cannot be null values) 
and also have referential integrity with the first table.

3.   Insert the last part of the data into a record in a third table.  
The id created in 2 needs to be part of this record. This has referential 
integrity with the second table.


metacode:

BEGIN;
INSERT INTO first_table ...;
SELECT currval(first_table);
INSERT INTO first_table ...;
SELECT currval(first_table);
INSERT INTO second_table ...;
INSERT INTO third_table (... currval(second_table));
COMMIT;

You can do this with any CLI, like libpq, the Perl DBI, PHP/PEAR
pgsql_* functions or DB...
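
Spelled out with made-up table and column names (serial keys, so the 
default sequences follow the tablename_column_seq naming), that might 
read:

BEGIN;

INSERT INTO first_table (payload) VALUES ('part one a');
SELECT currval('first_table_id_seq');   -- first id, read back by the client

INSERT INTO first_table (payload) VALUES ('part one b');
SELECT currval('first_table_id_seq');   -- second id, read back by the client

-- the client plugs the two captured ids into the next insert
INSERT INTO second_table (first_id_a, first_id_b, payload) VALUES (1, 2, 'part two');

INSERT INTO third_table (second_id, payload)
    VALUES (currval('second_table_id_seq'), 'part three');

COMMIT;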

--
How many Vietnam vets does it take to screw in a light bulb?
You don't know, man.  You don't KNOW.
Cause you weren't THERE. http://bash.org/?255991





Re: [GENERAL] Update more than one table

2005-07-12 Thread David Pratt
Hi Bruno and Roman. I am attempting to implement your advice. Bruno, how 
do I make a foreign key deferrable, since this sounds like an interesting 
approach?


I have got another problem on top of the first. For the first two inserts 
I need to insert a multi-dimensional array into one of the fields of the 
table, and the order of the arrays within the larger array is important.  
So I am in the process of making a function that will insert the record 
into the first table, rewrite the array, and return currval.  The problem 
I have run into is passing a multi-dimensional array as a parameter to a 
function.


To do a basic test of passing an array into a function I did this:

CREATE FUNCTION create_record_test(text[][]) RETURNS int4 AS '

DECLARE

test_array ALIAS FOR $1;  -- alias for input array  

BEGIN

return array_upper(test_array,1)

END;
' LANGUAGE 'plpgsql';

SELECT create_record_test(ARRAY[ARRAY['A','test one'],ARRAY['B','test two']]) AS output;


but I am getting syntax errors, and I have tried a variety of ways to 
quote the SELECT string and can't seem to get it to take the array as an 
input :(


Given that I will now have a function returning the currval for each 
insert (once I work out how to pass the array to the function), will the 
approaches suggested still work, or should I create another function that 
does the insert for table 2, taking the currval each time as a variable 
for the statement inside the function, and have the function for the 
table 2 insert return currval as well?


Regards,
David






On Tuesday, July 12, 2005, at 12:08 PM, Bruno Wolff III wrote:


On Sun, Jul 10, 2005 at 15:05:30 -0300,
  David Pratt <[EMAIL PROTECTED]> wrote:

Hi Roman.  Many thanks for your reply.  This is interesting and I will 
give it a try and let you know how it works out. You are right, the 
application logic and the transaction don't have to be separate, which 
would be nice for this.  I was thinking the only way to solve it was a 
function that performed an update and returned the nextval at the same 
time, so that I could use that value to perform the update on the next 
table, etc.


Normally you can just use currval. But in your case you insert two 
records, and currval will only return the value of the second record's 
key. Assuming the first record's key is one less than the second's is not 
a good idea. With the current version you can probably make this work 
reliably by grabbing a block of ids for your session and making sure that 
the two records get their keys from the same preallocated block.

Another option that I think could work is to make the two foreign key 
checks deferrable and insert the record for table 2 before the two 
records in table 1. You can use 
nextval(pg_get_serial_sequence('table1', 'table1key')) twice in that 
insert. Then when inserting the two entries into table 1 you can use 
currval to get the key value from the record in table 2 and use the 
appropriate column for each of the two records. As long as you aren't 
depending on the ordering of the key values for the two records in 
table 1 you should be OK.
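
In sketch form, with invented table and column names (and assuming table2 
has a serial id of its own), that ordering looks something like:

-- declare the foreign keys deferrable (or add them this way to begin with)
ALTER TABLE table2
    ADD CONSTRAINT table2_key_a_fk FOREIGN KEY (table1key_a)
        REFERENCES table1 (table1key) DEFERRABLE INITIALLY DEFERRED;
ALTER TABLE table2
    ADD CONSTRAINT table2_key_b_fk FOREIGN KEY (table1key_b)
        REFERENCES table1 (table1key) DEFERRABLE INITIALLY DEFERRED;

BEGIN;

-- insert the table2 record first, drawing both table1 keys from table1's sequence
INSERT INTO table2 (table1key_a, table1key_b, payload)
    VALUES (nextval(pg_get_serial_sequence('table1', 'table1key')),
            nextval(pg_get_serial_sequence('table1', 'table1key')),
            'part two');

-- then insert the two table1 rows with exactly the keys just handed out
INSERT INTO table1 (table1key, payload)
    SELECT table1key_a, 'part one a' FROM table2
        WHERE id = currval(pg_get_serial_sequence('table2', 'id'));
INSERT INTO table1 (table1key, payload)
    SELECT table1key_b, 'part one b' FROM table2
        WHERE id = currval(pg_get_serial_sequence('table2', 'id'));

COMMIT;   -- the deferred foreign key checks happen here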



[GENERAL] Array as parameter for plpgsql function

2005-07-13 Thread David Pratt
How does one pass an array as a parameter to a plpgsql function?  I have 
tried the following. I can't seem to get the select statement to work 
without syntax problems, and there are no examples in my Postgres book to 
help with this :(  This is just a test, so please ignore the fact that it 
is a simplistic function.


CREATE FUNCTION create_record_test(text[][]) RETURNS int4 AS '

DECLARE

test_array ALIAS FOR $1;  -- alias for input array  

BEGIN

return array_upper(test_array,1)

END;
' LANGUAGE 'plpgsql';

SELECT create_record_test(ARRAY[ARRAY['A','test one'],ARRAY['B','test two']]) AS output;




Re: [GENERAL] Array as parameter for plpgsql function

2005-07-13 Thread David Pratt
argh!!!  It was telling me I had an error in the select statement. Thanks 
Tom!


Regards
David

On Wednesday, July 13, 2005, at 11:08 AM, Tom Lane wrote:


David Pratt <[EMAIL PROTECTED]> writes:

CREATE FUNCTION create_record_test(text[][]) RETURNS int4 AS '
DECLARE
test_array ALIAS FOR $1;  -- alias for input array  
BEGIN
return array_upper(test_array,1)
END;
' LANGUAGE 'plpgsql';



SELECT create_record_test(ARRAY[ARRAY['A','test one'],ARRAY['B','test two']]) AS output;


Works fine for me, once I add the semicolon you forgot:

return array_upper(test_array,1);

regards, tom lane



[GENERAL] Postgres as embedded db for GUI

2005-11-24 Thread David Pratt
Hi.  Where would I locate information on setting up postgres as a 
stand-alone db for a GUI application?  I am writing this in Python and 
have been using SQLite, but I would like to try my app with a postgres 
backend.  Links to any specific open source example would be helpful. The 
only user would be the application itself, as opposed to using postgres 
as a server for multiple client applications.


Many thanks
David



Re: [GENERAL] Postgres as embedded db for GUI

2005-11-24 Thread David Pratt
Hi Jaime.  I currently work with psycopg for web apps. My question is 
more a matter of whether postgres can be installed in a way that keeps it 
contained to the program files for the app (as opposed to the regular 
type of install where postgres becomes available as a server anywhere on 
the operating system).  I don't want the installation used for tables 
other than the app's tables (or available to anything other than the 
app).


I am not sure whether postgres can be set up this way, which is the 
reason for the question.  I have only ever installed it on FreeBSD and 
Mac OS X in the normal way and then compiled the db adapter to work with 
it.


Regards,
David

On Thursday, November 24, 2005, at 12:06 PM, Jaime Casanova wrote:


On 11/24/05, David Pratt <[EMAIL PROTECTED]> wrote:
Hi.  Where would I locate information on setting up postgres as a 
stand-alone db for a GUI application?  I am writing this in Python and 
have been using SQLite, but I would like to try my app with a postgres 
backend. Links to any specific open source example would be helpful. The 
only user would be the application itself, as opposed to using postgres 
as a server for multiple client applications.

Many thanks
David




google for psycopg2

--
Atentamente,
Jaime Casanova
(DBA: DataBase Aniquilator ;)
