I have just done the first iteration of a perl script that will take a log file (not syslog yet) and parse it into timestamp, pid, dbname, keyword and details (timestamp, pid and dbname being optional), accumulating continuation lines. It will then either write out split files based on dbname, or load the data into a table, or both. It's far from bulletproof, and totally uncommented, but it does work (for me :-), and demonstrates what can be done with what I have referred to as out-of-band processing.
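
To give a flavour of the approach, here is a minimal sketch of the parsing loop (not the script itself; it assumes log lines of the form "timestamp [pid] dbname KEYWORD: details" with continuation lines indented, and writes one "<dbname>.log" file per database):

#!/usr/bin/perl
use strict;
use warnings;

my %split;    # dbname => list of parsed records
my $current;  # record currently accumulating continuation lines

while (my $line = <>) {
    chomp $line;
    if ($line =~ /^(\d{4}-\d\d-\d\d \d\d:\d\d:\d\d)\s+\[(\d+)\]\s+(\S+)\s+(\w+):\s*(.*)$/) {
        # new entry: timestamp, pid, dbname, keyword, details
        $current = { ts => $1, pid => $2, db => $3, key => $4, details => $5 };
        push @{ $split{$3} }, $current;
    }
    elsif (defined $current && $line =~ /^\s+(.*)$/) {
        # continuation line: accumulate into the previous entry's details
        $current->{details} .= " $1";
    }
}

# write one split file per dbname (loading into a table is left out here)
for my $db (keys %split) {
    open my $out, '>', "$db.log" or die "cannot write $db.log: $!";
    print {$out} "$_->{ts}\t$_->{pid}\t$_->{key}\t$_->{details}\n"
        for @{ $split{$db} };
    close $out;
}

The real script handles the optional fields and the table load as well; the prefix regex is the bit you would adjust to match your log format.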


Not sure what I should do with it: let people play with it, continue working to productise it and post it to patches (where would it belong?), or give it to the "cookbook"?

Doing this does make one aware that it is occasionally a pity that cross-database queries are not supported. I guess the best way to handle it would be to have a db just for the logs, thus:

create table logbase (ts timestamp, db text, pid int, key text, details text);
create view foolog as select ts, pid, key, details from logbase where db = 'foo';
grant select on foolog to foodba;


and adjust your pg_hba.conf appropriately.
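
For example, an entry along these lines (hypothetical database and network names, and assuming a version that accepts CIDR-style addresses) would let foodba connect to the logs database from the local network:

# let foodba at the logs database over TCP
host    logs    foodba    192.168.0.0/24    md5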

cheers

andrew

