2014-02-15 21:52 GMT+01:00 AlexK :
> Hi Pavel,
>
> 1. I believe we have lots of memory. How much is needed to read one array
> of 30K float numbers?
>
it is not too much - about 120KB
> 2. What do we need to do to avoid the possible repeated detoast, and what is it?
>
any access to the array emits a detoast - s
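As a back-of-the-envelope check on that figure (simple arithmetic, not from the
thread): 30,000 single-precision (float4/real) values are 30,000 x 4 bytes,
roughly 120 kB, while 30,000 double precision (float8) values come to roughly
240 kB, plus a small array header either way. pg_column_size() on a freshly
built array shows the detoasted datum size:

SELECT pg_column_size(array_fill(0::real,   ARRAY[30000])) AS float4_bytes,
       pg_column_size(array_fill(0::float8, ARRAY[30000])) AS float8_bytes;
-- roughly 120 kB and 240 kB respectively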
Hi Pavel,
1. I believe we have lots of memory. How much is needed to read one array
of 30K float numbers?
2. What do we need to do to avoid the possible repeated detoast, and what is it?
3. We are not going to update individual elements of the arrays. We might
occasionally replace the whole thing. When we ben
Hello
I have worked with 80K-element float fields without any problem.
There are possible issues:
* it needs a lot of memory for detoast - this can be a problem with many
parallel queries
* there is a risk of repeated detoast - some unhappy usage in plpgsql can be
slow - it is solvable, but you have to identify it
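To illustrate that second point, a rough sketch of the pattern to watch for
(the function and table names are invented for illustration, and how much a
repeated detoast actually hurts depends on the PostgreSQL version and on how
the value reaches plpgsql):

-- element-by-element access in a plpgsql loop is the kind of usage that
-- can pay the detoast cost on every reference:
CREATE OR REPLACE FUNCTION sum_loop(arr float8[]) RETURNS float8 AS $$
DECLARE
  s float8 := 0;
  i int;
BEGIN
  FOR i IN 1 .. array_length(arr, 1) LOOP
    s := s + arr[i];
  END LOOP;
  RETURN s;
END;
$$ LANGUAGE plpgsql;

-- doing the same work in one SQL expression touches the detoasted array
-- once ('measurements(id int, vals float8[])' is a hypothetical table):
SELECT id, (SELECT sum(x) FROM unnest(vals) AS t(x)) AS total
FROM measurements;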
Would 10K elements of float[3] make any difference in terms of read/write
performance?
Or a 240K-byte array?
Or are these all functionally the same issue for the server? If so,
intriguing possibilities abound. :)
--
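One way to compare those three shapes directly is to measure the datum sizes;
a rough sketch, reading "10K elements of float[3]" as a 10000 x 3
two-dimensional array (array_fill and decode are only used here to build dummy
values of the right size):

SELECT pg_column_size(array_fill(0::float8, ARRAY[30000]))    AS flat_30k_doubles,
       pg_column_size(array_fill(0::float8, ARRAY[10000, 3])) AS dims_10000_x_3,
       pg_column_size(decode(repeat('00', 240000), 'hex'))    AS bytea_240k;
-- all three come out close to 240 kB plus small headers, before any TOAST
-- compression, so for whole-value reads and writes the server is moving
-- essentially the same amount of data in each case.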
I will always be reading/writing the whole array. The table is about 40 GB. It
replaces two tables, parent and child, which together use about 160 GB.
--
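For context, a minimal sketch of the two layouts being compared here (all
table and column names are invented; nothing below is from the thread). Much
of the space saving presumably comes from dropping the per-row tuple and index
overhead of the ~30K child rows per parent:

-- parent/child layout: one row per element
CREATE TABLE series       (id  int PRIMARY KEY);
CREATE TABLE series_value (id  int REFERENCES series(id),
                           ord int,
                           val float8,
                           PRIMARY KEY (id, ord));

-- array layout: the whole series in a single TOASTed column
CREATE TABLE series_packed (id   int PRIMARY KEY,
                            vals float8[]);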
On Tue, Feb 4, 2014 at 2:59 PM, Rob Sargent wrote:
> On 02/04/2014 01:52 PM, AlexK wrote:
>
> Every row of my table has a double[] array of approximately 30K numbers. I
> have run a few tests, and so far everything looks good.
>
> I am not pushing the limits here, right? It should be perfectly fine to
> store arrays of 30k double numbers, correct?
No large deletes, just inserts/updates/selects. What are the potential
problems with deletes?
--
On 02/04/2014 01:52 PM, AlexK wrote:
Every row of my table has a double[] array of approximately 30K numbers. I
have run a few tests, and so far everything looks good.
I am not pushing the limits here, right? It should be perfectly fine to
store arrays of 30k double numbers, correct?
--
Every row of my table has a double[] array of approximately 30K numbers. I
have run a few tests, and so far everything looks good.
I am not pushing the limits here, right? It should be perfectly fine to
store arrays of 30k double numbers, correct?
--
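To make the original setup concrete, a small hedged sketch of whole-array
writes and reads of a ~30K-element double precision array, reusing the
invented series_packed table from the sketch above (nothing here is from the
thread):

-- write one row with a 30K-element array built on the fly
INSERT INTO series_packed (id, vals)
SELECT 1, array_agg(random())
FROM generate_series(1, 30000);

-- replacing the whole array is a plain single-column UPDATE
UPDATE series_packed
   SET vals = (SELECT array_agg(random()) FROM generate_series(1, 30000))
 WHERE id = 1;

-- reading the whole array back is one row fetch
SELECT vals FROM series_packed WHERE id = 1;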