On 2/22/21 9:00 AM, Oleksandr Voytsekhovskyy wrote:
Greetings,
We have queries with IN filters containing long lists of INT values.
Sometimes they run extremely slowly, and I found a suggestion to use the
syntax
Field IN (VALUES(1465), (1478), ...
Instead of
Field IN (1465, 1478, ...
In some cases it helps, but in others it makes the query run 1000+ times slower.
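For concreteness, the two forms being compared look roughly like this (the table name my_table and column field are placeholders, not from the original queries):

```sql
-- Plain IN list: the planner folds the constants directly into the
-- scan qualifier.
SELECT * FROM my_table WHERE field IN (1465, 1478 /* , ... */);

-- IN (VALUES ...): the list becomes a Values Scan that is de-duplicated
-- and joined, which can be faster or slower depending on the plan chosen.
SELECT * FROM my_table WHERE field IN (VALUES (1465), (1478) /* , ... */);
```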
Insert the values into a TEMPORARY TABLE, then join that to your main table?
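A sketch of that temporary-table approach, with hypothetical names (filter_ids, my_table, field); the ANALYZE step gives the planner row estimates for the temp table before the join:

```sql
-- Load the filter values once, then join instead of using IN.
CREATE TEMPORARY TABLE filter_ids (id int PRIMARY KEY);
INSERT INTO filter_ids VALUES (1465), (1478);  -- ... rest of the list
ANALYZE filter_ids;

SELECT t.*
FROM   my_table t
JOIN   filter_ids f ON f.id = t.field;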
So the issue is:
- some queries much faster with simple IN
- some queries much faster with IN + VALUES
Here is the EXPLAIN output for a case where it’s extremely slow:
->  HashAggregate  (cost=5.78..9.62 rows=385 width=4)
      Group Key: "*VALUES*".column1
      ->  Values Scan on "*VALUES*"  (cost=0.00..4.81 rows=385 width=4)
What is the right way to pass a long list of INT values to an IN filter?
I am using PostgreSQL 13.1 on Ubuntu.
--
Angular momentum makes the world go 'round.