More information about my setup:
Postgres version:
PostgreSQL 10.9 (Debian 10.9-1.pgdg90+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 6.3.0-18+deb9u1) 6.3.0 20170516, 64-bit
Table schema:
CREATE TABLE public.projects (
    misc jsonb DEFAULT '{}'::jsonb NOT NULL
);
Explain analyze:
explain analyze update projects set misc = misc - 'foo';
Update on projects  (cost=0.00..4240.93 rows=10314 width=1149) (actual time=346318.291..346318.295 rows=0 loops=1)
  ->  Seq Scan on projects  (cost=0.00..4240.93 rows=10314 width=1149) (actual time=1.011..266.435 rows=10314 loops=1)
Planning time: 40.087 ms
Trigger trigger_populate_tsv_body_on_projects: time=341202.492 calls=10314
Execution time: 346320.260 ms
Time: 345969.035 ms (05:45.969)
Figured out that it's due to the trigger: trigger_populate_tsv_body_on_projects accounts for roughly 341 of the 346 seconds. Thanks for your help, Adrian!
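For anyone else who hits this, a sketch of two possible workarounds (assuming the trigger only maintains a tsvector column and doesn't need to fire for rows whose misc is left untouched):

-- Only rewrite rows that actually contain the key, so rows without 'foo'
-- are never updated and the trigger never fires for them:
update projects set misc = misc - 'foo' where misc ? 'foo';

-- Or, for a one-off cleanup, temporarily disable the trigger
-- (requires table ownership; the tsv column may need a manual refresh afterwards):
alter table projects disable trigger trigger_populate_tsv_body_on_projects;
update projects set misc = misc - 'foo';
alter table projects enable trigger trigger_populate_tsv_body_on_projects;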
On Wed, Jul 17, 2019 at 10:39 AM Adrian Klaver wrote:
> On 7/17/19 7:30 AM, Volkan Unsal wrote:
> > I'm trying to remove a key from a jsonb column in a table with 10K rows,
> > and the performance is abysmal. When the key is missing, it takes 5
> > minutes. When the key is present, it takes even longer.
> >
> > Test with non-existent key:
> >
> > >> update projects set misc = misc - 'foo';
> > Time: 324711.960 ms (05:24.712)
> >
> > What can I do to improve this?
>
> Provide some useful information:
>
> 1) Postgres version
>
> 2) Table schema
>
> 3) Explain analyze of query
>
>
>
> --
> Adrian Klaver
> adrian.kla...@aklaver.com
>
--
*Volkan Unsal*
*Product Engineer*
volkanunsal.com