On 02/04/2014 01:52 PM, AlexK wrote:
Every row of my table has a double[] array of approximately 30K numbers. I
have run a few tests, and so far everything looks good.
I am not pushing the limits here, right? It should be perfectly fine to
store arrays of 30k double numbers, correct?
What sorts of tests and what sorts of results?
Each record is something like 30000 * 16 bytes plus 30000 * (per-cell
overhead, which could be zero), so those arrays are definitely spilling
over into TOAST storage.
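If you want to see how much of that actually lands in TOAST, something
like the sketch below works; the table and column names (samples, vals)
are made up for illustration, and I'm assuming the doubles are stored in
a float8[] column:

  -- hypothetical table: one float8[] of 30,000 values per row
  CREATE TABLE samples (
      id    bigserial PRIMARY KEY,
      vals  float8[] NOT NULL
  );

  -- load one row of 30,000 random doubles
  INSERT INTO samples (vals)
  SELECT array_agg(random()) FROM generate_series(1, 30000);

  -- bytes used to store that array value (after compression / TOASTing)
  SELECT pg_column_size(vals) FROM samples;

  -- total table size including its TOAST relation
  SELECT pg_size_pretty(pg_total_relation_size('samples'));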
Have you done any large-scale deletes?
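Large deletes leave dead tuples behind in both the heap and the TOAST
table until (auto)vacuum reclaims them, which matters with rows this
wide. A rough way to check (again using the hypothetical samples table
above) might be:

  -- dead-tuple counts and last autovacuum run for the table
  SELECT relname, n_live_tup, n_dead_tup, last_autovacuum
  FROM pg_stat_user_tables
  WHERE relname = 'samples';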