Daniele Varrazzo <[EMAIL PROTECTED]> writes:
> In my problem I had two tables: a small one (accounts) and a large one (foo).
> The way the query is written doesn't allow the stats from the large table to
> be used at all unless the records from the small table are fetched. This is
> independent from
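
A minimal sketch of the query shape being described, assuming a hypothetical
schema (accounts.id, accounts.name, foo.account_id); the real tables from the
thread are not shown in this excerpt:

    -- Hypothetical schema, only to illustrate the point above.
    CREATE TABLE accounts (id integer PRIMARY KEY, name text);
    CREATE TABLE foo (
        id         bigint PRIMARY KEY,
        account_id integer REFERENCES accounts (id),
        payload    text
    );

    -- The accounts of interest are selected by a predicate on the small
    -- table, so the planner can only guess how many account ids the
    -- subquery returns; the per-value statistics on foo.account_id are
    -- not applied to a concrete list of values.
    SELECT count(*)
    FROM foo
    WHERE account_id IN (SELECT id FROM accounts WHERE name LIKE 'acme%');
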
Francisco Reyes writes:
Daniele Varrazzo writes:
I suspect the foo.account_id statistical data are not used at all in
the query: the query planner can only estimate the number of accounts to
look for, not
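
One way to check this suspicion (reusing the hypothetical schema sketched
above) is to compare the planner's estimated row counts against the actual
ones:

    -- "rows=" in the plan is the estimate; "actual ... rows=" is what
    -- really came back.  A large mismatch on the foo scan suggests the
    -- account_id statistics are not being put to use here.
    EXPLAIN ANALYZE
    SELECT count(*)
    FROM foo
    WHERE account_id IN (SELECT id FROM accounts WHERE name LIKE 'acme%');
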
You mentioned you bumped your default_statistics_target.
What did you increase it to?
My d
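
For reference, a sketch of how the statistics target can be inspected and
raised for a single column (the actual value used in this thread is cut off
above, so the 500 here is only an arbitrary example):

    -- Current global setting.
    SHOW default_statistics_target;

    -- Raise the target only for the column whose estimates are off,
    -- then re-collect statistics so the change takes effect.
    ALTER TABLE foo ALTER COLUMN account_id SET STATISTICS 500;
    ANALYZE foo;
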
pgbench is unrelated to the workload you are concerned with if ETL/ELT and
decision-support / data-warehousing queries are your target.
Also, placing the xlog on dedicated disks is mostly irrelevant to data
warehouse / decision support work or ELT. If you need to maximize loading
speed while
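
As an illustration of what maximizing loading speed usually amounts to in
practice (a sketch assuming the data arrives as a CSV file at a hypothetical
path, not the specific setup discussed in the thread):

    -- Bulk-load with COPY and build indexes only after the data is in;
    -- this is typically much faster than row-by-row INSERTs.
    BEGIN;
    CREATE TABLE staging_foo (id bigint, account_id integer, payload text);
    COPY staging_foo FROM '/path/to/foo.csv' WITH (FORMAT csv);
    CREATE INDEX staging_foo_account_id_idx ON staging_foo (account_id);
    COMMIT;
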
I'm trying to run a few basic tests to see what a current machine can
deliver (a typical ETL-like workload: long-running aggregate queries,
medium-size DB of ~100 to 200 GB).
I'm currently checking the system (dd, bonnie++) to see if performance
is within the normal range, but I'm having trouble
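
Alongside the raw disk tests, a quick synthetic aggregate run from inside
PostgreSQL gives a rough feel for sequential-scan and aggregation throughput
(a sketch; the table name and row count are arbitrary, adjust the size to
taste and use \timing in psql to measure):

    -- Build a throwaway table of some tens of millions of rows ...
    CREATE TABLE bench_agg AS
        SELECT g AS id, g % 1000 AS grp, random() AS val
        FROM generate_series(1, 50000000) AS g;
    ANALYZE bench_agg;

    -- ... then time a full-table aggregate, the kind of long-running
    -- query the workload above is about.
    SELECT grp, count(*), sum(val), avg(val)
    FROM bench_agg
    GROUP BY grp;
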