Thanks Allan for the input - I guess I didn't specify enough details.  I am 
looking for some type of tool/report that is already done.  We have nearly 1000 
tables and over 300 functions to look through, and a little over a day to 
provide the answers (all without dropping any other tasks, of course).  I had 
considered the trigger idea, and may end up doing it anyway and just working 
late, but thought I would check for a "ready-made" solution first.

Andy

-----Original Message-----
From: Allan Kamau [mailto:kamaual...@gmail.com] 
Sent: Thursday, February 25, 2010 12:41 PM
To: Andy Yoder
Cc: pgsql-general@postgresql.org
Subject: Re: [GENERAL] Tool for determining field usage of database tables

Writing an audit trigger for the operations you'd like to monitor (and
assigning it to all your application's tables) may be one easy way of
doing so.  The trigger would log the operations to some other table.

Allan.
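
Allan's suggestion might be sketched roughly as below. This is only an
illustration, not tested code; the table and function names (audit_log,
audit_usage, my_table) are made up, and it logs operations per table
rather than per column:

```sql
-- Hypothetical sketch of a generic audit trigger.
CREATE TABLE audit_log (
    logged_at   timestamptz NOT NULL DEFAULT now(),
    table_name  text        NOT NULL,
    operation   text        NOT NULL   -- INSERT / UPDATE / DELETE
);

CREATE OR REPLACE FUNCTION audit_usage() RETURNS trigger AS $$
BEGIN
    -- TG_TABLE_NAME and TG_OP are built-in PL/pgSQL trigger variables.
    INSERT INTO audit_log (table_name, operation)
    VALUES (TG_TABLE_NAME, TG_OP);
    RETURN NULL;  -- return value is ignored for AFTER triggers
END;
$$ LANGUAGE plpgsql;

-- Attach the same function to each table of interest, e.g.:
CREATE TRIGGER my_table_audit
AFTER INSERT OR UPDATE OR DELETE ON my_table
FOR EACH ROW EXECUTE PROCEDURE audit_usage();
```

With ~1000 tables, the CREATE TRIGGER statements could themselves be
generated from the catalogs rather than written by hand.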

On Thu, Feb 25, 2010 at 7:36 PM, Andy Yoder <ayo...@airfacts.com> wrote:
> Does anyone know of a tool (or a way to use the database catalogs) that can
> analyze function code/queries accessing the database to pull out a list of
> the fields used in a set of tables.  Basically we are importing a lot of
> data from another source, and we are trying to determine what percentage of
> the data we are actually using at this point .  We have hundreds of stored
> procedures, and combing through the code would not be practical.
>
>
>
> Thanks.
>
>
>
> --Andy
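
For the catalog-based angle of the original question, one crude
starting point is a textual scan of function bodies in pg_proc for
column names. This is only a sketch (untested, schema name assumed to
be 'public'), and a plain substring match will produce false positives
whenever a column name also appears as another identifier or word:

```sql
-- Hypothetical sketch: count, per column, how many function bodies
-- mention that column's name anywhere in their source text.
SELECT c.table_name,
       c.column_name,
       count(p.oid) AS functions_mentioning
FROM information_schema.columns c
LEFT JOIN pg_proc p
       ON p.prosrc ILIKE '%' || c.column_name || '%'
LEFT JOIN pg_namespace n
       ON n.oid = p.pronamespace
      AND n.nspname = 'public'
WHERE c.table_schema = 'public'
GROUP BY c.table_name, c.column_name
ORDER BY functions_mentioning, c.table_name, c.column_name;
```

Columns with a zero count are candidates for "never referenced by any
stored procedure", though dynamic SQL and queries issued directly by
applications would not show up here.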

-- 
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general