Hello dear List,
I'm currently wondering how to streamline the normalization of a new
table.
I often have to import messy CSV files into the database, and making a
clean, normalized version of these takes me a lot of time (think dozens
of columns and millions of rows).
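The kind of automation this describes might look like the following minimal sketch (all names and sample data are illustrative, not from the original post): pull a repeating column out of a wide table into a lookup table keyed by surrogate ids, leaving the fact rows referencing it by id.

```python
import csv
import io

# Illustrative denormalized input: "country" repeats across rows.
RAW = """city,country,population
Paris,France,2148000
Lyon,France,513000
Berlin,Germany,3645000
"""

def normalize(text, key):
    """Split a denormalized CSV into (lookup, facts).

    lookup maps each distinct value of `key` to a surrogate id;
    facts are the original rows with `key` replaced by `<key>_id`.
    """
    rows = list(csv.DictReader(io.StringIO(text)))
    lookup = {}   # distinct value -> surrogate id
    facts = []
    for r in rows:
        val = r.pop(key)
        lookup.setdefault(val, len(lookup) + 1)
        facts.append({**r, f"{key}_id": lookup[val]})
    return lookup, facts

countries, cities = normalize(RAW, "country")
```

In a real workflow each distinct column (or correlated group of columns) would get the same treatment, and the resulting lookup tables would become foreign-key targets.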
I wrote some code to aut
Wow,
it was right under my nose.
Thank you very much!
Cheers,
Remi-C
On Tue, Nov 13, 2018 at 19:00, Tom Lane wrote:
> Rémi Cura writes:
> > So pgpointcloud sometimes stores very large groups of points in one
> > row (TOASTED), somewhere from a few kB to a few MB. TOAST wou
Hi dear list,
I have a tricky question about TOAST storage in Postgres related to the
[pgpointcloud](https://github.com/pgpointcloud/pointcloud) extension.
(using Postgres 11 if it matters)
So pgpointcloud sometimes stores very large groups of points in one
row (TOASTED), somewhere from a few
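The per-row TOASTed sizes mentioned above can be inspected with Postgres' built-in `pg_column_size()`, which reports the stored (possibly compressed, possibly out-of-line) size of a value. A minimal sketch, assuming a hypothetical table `pts` with a patch column `pa` (both names are illustrative):

```sql
-- Show the ten largest stored patch values; pg_column_size reports
-- the on-disk size of the (possibly TOASTed) datum, not its in-memory size.
SELECT id,
       pg_column_size(pa) AS patch_bytes
FROM pts
ORDER BY patch_bytes DESC
LIMIT 10;
```

This requires a live database with such a table, so it is a sketch rather than a self-contained example.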