Hi. I've got an app that queries pg_catalog to find any table columns that
have comments. After setting up PgBadger, it was #2 on my list of time
consuming queries, with min/max/avg duration of 199/2351/385 ms (across
~12,000 executions logged).
I'm wondering if there are any ways to speed this up.
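The query itself isn't shown in the thread, but a catalog query of roughly this shape is what returns every table column that has a comment (column-level comments live in pg_description with objsubid > 0; the aliases here are illustrative):

```sql
-- List all ordinary-table columns that carry a comment.
SELECT n.nspname  AS schema_name,
       c.relname  AS table_name,
       a.attname  AS column_name,
       d.description
FROM pg_description d
JOIN pg_class      c ON c.oid = d.objoid
JOIN pg_namespace  n ON n.oid = c.relnamespace
JOIN pg_attribute  a ON a.attrelid = d.objoid
                    AND a.attnum   = d.objsubid   -- objsubid = column number
WHERE d.objsubid > 0          -- 0 would mean a comment on the table itself
  AND c.relkind = 'r';        -- ordinary tables only
```

Queries like this can be slow because pg_description is joined against several catalogs and none of it benefits from user-created indexes; caching the result application-side is often the practical fix.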
Another suggestion: try clustering the table on the index for the
"field" column, then run ANALYZE. If you're on a spinning disk, it will
also help to sort your data by "field" during bulk insert.
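Assuming the table is called users (the real table name isn't given in the thread), the suggested clustering step would look like:

```sql
-- Physically reorder the table to match the "field" index, then refresh stats.
CREATE INDEX IF NOT EXISTS users_field_idx ON users (field);
CLUSTER users USING users_field_idx;
ANALYZE users;
```

Note that CLUSTER takes an ACCESS EXCLUSIVE lock for its duration, and the ordering is not maintained for later inserts, which is why sorting the data at bulk-load time helps too.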
--
regards
marie g. bacuno ii
On Fri, Jul 19, 2019 at 07:43:26PM +0530, mayank rupareliya wrote:
Well, you haven't shown us the execution plan, so it's hard to check why
it did not help much and give you further advice.
This is the latest query execution plan, from EXPLAIN, after adding indexes on
both columns.
Aggregate (cost=174173.57..174173.58 rows=1 width=8) (actual
time=65087.657..65087.65
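The aggregate node alone reports roughly 65 seconds of actual time, far above its estimated cost. To diagnose where that time goes, it usually helps to re-run with buffer statistics and post the complete plan (the query below is a stand-in for the real one, which isn't shown in full in this thread):

```sql
-- BUFFERS shows shared-buffer hits vs. disk reads per plan node,
-- which reveals whether the time is spent on I/O.
EXPLAIN (ANALYZE, BUFFERS)
SELECT count(*) FROM users WHERE field = 'some_value';
```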