On 04/15/2018 05:05 PM, Christophe Pettus wrote:
>> On Apr 15, 2018, at 12:16, David Arnold <dar@xoe.solutions> wrote:
>>
>> Core-Problem: "Multi line logs are unnecessarily inconvenient to parse and 
>> are not compatible with the design of some (commonly used) logging 
>> aggregation flows."
> I'd argue that the first line of attack on that should be to explain to those 
> consumers of logs that they are making some unwarranted assumptions about the 
> kind of inputs they'll be seeing.  PostgreSQL's CSV log format is not
> particularly bizarre, or very difficult to parse.  The standard Python
> CSV library handles it just fine, for example.  You have to handle newlines 
> that are part of a log message somehow; a newline in a PostgreSQL query, for 
> example, needs to be emitted to the logs.
>


In JSON, newlines would have to be escaped, since literal newlines are
not legal in JSON strings. Postgres' own CSV parser has no difficulty at
all handling newlines embedded in the fields of CSV logs.
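To make the contrast concrete, here's a quick sketch in Python (the
"severity"/"message" field names are invented for the illustration),
round-tripping a multi-line message through CSV and serializing the same
thing to JSON:

import csv, io, json

message = "could not execute query:\nSELECT 1;"  # contains a literal newline

# CSV: the newline stays literal inside a quoted field, and
# csv.reader reconstructs it without any special handling.
buf = io.StringIO()
csv.writer(buf).writerow(["ERROR", message])
row = next(csv.reader(io.StringIO(buf.getvalue())))
assert row[1] == message

# JSON: the same newline must be escaped as \n, so the whole
# record is guaranteed to occupy one physical line.
print(json.dumps({"severity": "ERROR", "message": message}))

The csvlog consumer just has to use a real CSV parser rather than
splitting on newlines, which is exactly the point above.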

I'm not necessarily opposed to providing for JSON logs, but the overhead
of named keys could get substantial. Abbreviated keys might help, but
generally I think I would want to put such logs on a compressed ZFS
drive or some such.
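As a back-of-the-envelope illustration of that overhead (using an
abridged, invented subset of csvlog columns, not the full format):

import csv, io, json

rec = {
    "log_time": "2018-04-15 17:05:00.123 UTC",
    "user_name": "postgres",
    "database_name": "mydb",
    "process_id": 12345,
    "error_severity": "ERROR",
    "sql_state_code": "42601",
    "message": "syntax error at or near \"foo\"",
}

buf = io.StringIO()
csv.writer(buf).writerow(rec.values())
# positional CSV row vs. JSON object with full key names, in characters
print(len(buf.getvalue()), len(json.dumps(rec)))

In this made-up example the JSON form comes out at well over twice the
size of the CSV row, and most of the difference is the repeated key
names, hence the appeal of abbreviated keys or filesystem-level
compression.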

cheers

andrew

-- 
Andrew Dunstan                https://www.2ndQuadrant.com
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services

