On 11/21/2010 12:31 AM, Terry Laurenzo wrote:

I copied the 5 sample documents from json.org's example section for these tests. These are loaded into a table with a varchar column 1000 times each (so the test table has 5000 rows in it). In all situations, the binary encoding was smaller than the normalized text form (between 9 and 23% smaller). I think there are cases where the binary form will be larger than the corresponding text form, but I don't think they would be very common.
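A minimal sketch of how that kind of size comparison could be reproduced; the table and column names here are assumptions for illustration, not the actual test schema:

    -- Hypothetical test table: text form in "doc" (varchar), binary
    -- encoding in "doc_bin" (bytea for illustration).  octet_length()
    -- counts the raw, uncompressed bytes of each representation.
    SELECT sum(octet_length(doc))     AS text_bytes,
           sum(octet_length(doc_bin)) AS binary_bytes,
           round(100.0 * (1 - sum(octet_length(doc_bin))::numeric
                              / sum(octet_length(doc))), 1) AS pct_smaller
    FROM   json_test;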


Is that a pre-TOAST or post-TOAST comparison?

Even if it's post-TOAST, that doesn't seem like enough of a saving to convince me that simply storing the text, just as we do for XML, isn't a sensible way to go, especially when the cost of reproducing the text from a binary form for delivery to clients (including, say, pg_dump) is likely to be quite high.
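A sketch of how the pre-TOAST vs. post-TOAST distinction could be checked, assuming the same hypothetical table as above: octet_length() reports the logical, uncompressed byte count of a value, while pg_column_size() reports the bytes it actually occupies on disk after any TOAST compression.

    SELECT sum(octet_length(doc))   AS pretoast_bytes,
           sum(pg_column_size(doc)) AS posttoast_bytes
    FROM   json_test;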

cheers

andrew
