Thanks a lot, that is what I was looking for. Your query works very well for
small changes, but I will have to use another method when updating all my rows,
because unfortunately the performance is not very good.
My data set contains about 40,000 rows to update out of 1+ million records,
and data_raw and data_sys are of type "real". The complete update took 40
minutes on an Athlon 2400 with 256 MB of RAM, kernel 2.6, and no other load
during the execution of the query.
Is this normal? Does the number of columns in the table matter much (the
table contains 12 reals and 4 integers)?
I found that using an intermediate table that stores, for every row, the value
before and the value after helps to gain speed... but it is not a very elegant
approach, I think.
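Roughly what I mean by the intermediate table (a minimal sketch; the table and
column names here are made up, only data_raw/data_sys are real):

    -- Staging table holding, for each row to change, the key plus the new values.
    CREATE TEMP TABLE measurements_new (
        id        integer PRIMARY KEY,
        data_raw  real,
        data_sys  real
    );

    -- Load the ~40,000 changed rows into the staging table (e.g. with COPY),
    -- then apply them all in a single joined UPDATE:
    UPDATE measurements m
    SET    data_raw = n.data_raw,
           data_sys = n.data_sys
    FROM   measurements_new n
    WHERE  m.id = n.id;

This does one sequential pass with a join instead of many separate updates,
which seems to be where the speedup comes from.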
Thanks again :)
Etienne