Re: Same query 10000x More Time

2022-01-06 Thread Vijaykumar Jain
On Thu, 6 Jan 2022 at 20:01, Avi Weinberg wrote:
> Thanks for the input. postgres_fdw seems to bring the entire table even if all I use in the join is just the id from the remote table. I know it is possible to query for the missing ids and then perform the delete, but I wonder why all types of joins are so inefficient. …
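A quick way to check what postgres_fdw actually ships for a join like this is EXPLAIN with the VERBOSE option. The sketch below uses assumed names (tbl_local for the local table, tbl_link for the foreign table, id as the key), not the poster's exact schema:

    -- The "Remote SQL" line in the VERBOSE output shows exactly which
    -- columns and conditions postgres_fdw sends to the remote server,
    -- so you can confirm whether the whole foreign table is being
    -- fetched for a local/foreign join.
    EXPLAIN (ANALYZE, VERBOSE)
    SELECT l.id
    FROM   tbl_local l
    LEFT   JOIN tbl_link f ON f.id = l.id
    WHERE  f.id IS NULL;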

RE: Same query 10000x More Time

2022-01-06 Thread Avi Weinberg
Thanks for the input. postgres_fdw seems to bring the entire table even if all I use in the join is just the id from the remote table. I know it is possible to query for the missing ids and then perform the delete, but I wonder why all types of joins are so inefficient. DELETE FROM tbl_loc…
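A minimal sketch of the two-step approach mentioned above, assuming hypothetical names (tbl_local for the local copy, tbl_link for the foreign table, id as the key); only the id column crosses the network, and the delete then runs entirely locally:

    -- Step 1: copy just the ids that currently exist on the remote side.
    CREATE TEMP TABLE remote_ids AS
    SELECT id FROM tbl_link;   -- only the id column is requested remotely

    -- Step 2: delete local rows whose id no longer exists remotely.
    ANALYZE remote_ids;        -- give the planner row counts for the anti-join
    DELETE FROM tbl_local l
    WHERE NOT EXISTS (SELECT 1 FROM remote_ids r WHERE r.id = l.id);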

Re: Same query 10000x More Time

2022-01-06 Thread Vijaykumar Jain
On Thu, Jan 6, 2022, 3:50 PM Avi Weinberg wrote:
> Hi Kyotaro Horiguchi and Vijaykumar Jain, thanks for your quick reply! I understand that the fact that the slow query has a join caused this problem. However, why can't Postgres evaluate the table of the "IN" clause (select 140 as id union select 144 union select 148) and, based on its size, decide …
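One thing that helps the planner make that kind of size-based decision for foreign tables, sketched here with the thread's table name and a placeholder server name (foreign_srv), is giving it statistics for the foreign side, either by ANALYZEing the foreign table locally or by asking the remote server for estimates at plan time:

    -- Collect local statistics for the foreign table (samples remote rows).
    ANALYZE tbl_link;

    -- Or have postgres_fdw ask the remote server for cost estimates
    -- when planning queries that touch this server's tables.
    ALTER SERVER foreign_srv OPTIONS (ADD use_remote_estimate 'true');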

RE: Same query 10000x More Time

2022-01-06 Thread Avi Weinberg
Hi Kyotaro Horiguchi and Vijaykumar Jain, thanks for your quick reply! I understand that the fact that the slow query has a join caused this problem. However, why can't Postgres evaluate the table of the "IN" clause (select 140 as id union select 144 union select 148) and, based on its size, decide …
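For comparison, a sketch of the two ways of writing that filter against the foreign table (table and column names as in the thread, id values taken from the question): a literal list can be sent to the remote server as part of the query, while the union subquery is a separate relation that has to be joined:

    -- Shippable: the whole condition can appear in the Remote SQL.
    SELECT id FROM tbl_link WHERE id IN (140, 144, 148);

    -- Planned as a join: the union produces its own relation, so the
    -- foreign table is scanned and the semi-join is typically done locally.
    SELECT id FROM tbl_link
    WHERE id IN (SELECT 140 AS id UNION SELECT 144 UNION SELECT 148);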

Re: Same query 10000x More Time

2022-01-06 Thread Kyotaro Horiguchi
At Thu, 6 Jan 2022 13:50:55 +0530, Vijaykumar Jain wrote in:
> On Thu, 6 Jan 2022 at 13:13, Avi Weinberg wrote:
>> Hi, I have a postgres_fdw table called tbl_link. The source table is 2.5 GB in size with 122 lines (some lines have a 70MB bytea column, but not the ones I select in the example) …

Re: Same query 10000x More Time

2022-01-06 Thread Vijaykumar Jain
On Thu, 6 Jan 2022 at 13:13, Avi Weinberg wrote:
> Hi, I have a postgres_fdw table called tbl_link. The source table is 2.5 GB in size with 122 lines (some lines have a 70MB bytea column, but not the ones I select in the example). I noticed that when I put the specific ids in the list …
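For readers who want to try the scenario locally, a minimal sketch of a postgres_fdw foreign table like tbl_link; the server name, connection options, and column list are placeholders, not the poster's actual definitions:

    CREATE EXTENSION IF NOT EXISTS postgres_fdw;

    CREATE SERVER foreign_srv
        FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'remote-host', dbname 'remotedb', port '5432');

    CREATE USER MAPPING FOR CURRENT_USER
        SERVER foreign_srv
        OPTIONS (user 'remote_user', password 'secret');

    -- Foreign table standing in for tbl_link; "payload" represents the
    -- large bytea column mentioned in the thread.
    CREATE FOREIGN TABLE tbl_link (
        id      integer,
        payload bytea
    )
    SERVER foreign_srv
    OPTIONS (schema_name 'public', table_name 'tbl_link');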