Dave,
Ah
> -> Seq Scan on sales_tickets s (cost=0.00..89543.50
> rows=626783 width=12) (actual time=38.017..19387.447 rows=713846
> loops=1)
This is just more than 1/2 the total time of your query. The issue is that
you're pulling 713,000 rows (PG estimates 626,000, which is in the right
ballpark).
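One way to sanity-check whether the seq scan really is the right choice
is to disable it for the session and re-run EXPLAIN ANALYZE. A minimal
sketch, assuming the writtenDate index mentioned elsewhere in the thread
exists and the range covers December (the real query is truncated, so
adjust the bounds to match yours):

  SET enable_seqscan = off;  -- steer the planner toward an index path
  EXPLAIN ANALYZE
  SELECT * FROM sales_tickets
  WHERE writtenDate >= '12/01/2002'
    AND writtenDate <  '01/01/2003';  -- assumed end of the range
  RESET enable_seqscan;      -- restore default planner behavior

If the forced index-scan plan comes out slower, the seq scan was the
right call after all.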
The 'explain analyze' output is below. I have run ANALYZE recently, and
am using pg 7.4.2 on SuSE 9.1. I'd be curious to know how to "force a
nestloop indexscan" so I can try it out.
Thanks,
Dave
mydb=> explain analyze select * from line_items t, sales_tickets s
where writtenDate >= '12/01/2002' and writtenDa
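(For anyone wanting to try the same experiment: the usual trick for
forcing a nestloop indexscan is to switch off the competing plan types
for the session. A sketch; it assumes an index on the join column, and
both the join condition and the end of the date range are guesses, since
the query above is cut off:

  -- Leave the planner only one join strategy: a nested loop whose
  -- inner side is an index scan.
  SET enable_seqscan   = off;
  SET enable_hashjoin  = off;
  SET enable_mergejoin = off;

  EXPLAIN ANALYZE
  SELECT *
  FROM line_items t, sales_tickets s
  WHERE s.writtenDate >= '12/01/2002'
    AND s.writtenDate <  '01/01/2003'  -- assumed end of the range
    AND t.ticketId = s.ticketId;       -- assumed join condition

  RESET enable_seqscan;
  RESET enable_hashjoin;
  RESET enable_mergejoin;
)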
David Jaquay <[EMAIL PROTECTED]> writes:
> Summary: Doing a two or three table join for a date range performs
> worse than doing the same query individually for each date in the
> range.
Could we see EXPLAIN ANALYZE, not just EXPLAIN, results?
Also, have you ANALYZEd lately? If the estimated row counts are far from
the actual ones, the planner can easily choose a bad plan.
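A sketch of those two checks, using the table names from the thread
(ANALYZE and EXPLAIN ANALYZE are standard PostgreSQL; nothing here is
specific to this schema):

  -- Refresh the planner's statistics for both tables in the join.
  ANALYZE sales_tickets;
  ANALYZE line_items;

  -- EXPLAIN ANALYZE actually runs the query and prints the estimated
  -- row counts ("rows=...") next to the real ones ("actual ... rows=...");
  -- a large gap between the two usually means stale statistics.
  EXPLAIN ANALYZE
  SELECT * FROM sales_tickets WHERE writtenDate >= '12/01/2002';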
David,
> The problem: Doing a query for a date range on a particular store or
> market, though, for a date range of more than a few days does a
> sequential scan of sales_tickets, and performs worse than doing one
> single date query for each date. My 'explain' for one such query is
> below.
Can
Summary: Doing a two or three table join for a date range performs
worse than doing the same query individually for each date in the
range.
What works: Doing a query on just a single date, or on a date range
against just one table, runs quickly; 'explain' says it uses an index
scan. Doing a query on a single date across the join also performs well.
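For readers following along, a sketch of the contrast being described
(table and column names are from the thread; the join key and the end of
the range are assumptions):

  -- Fast: a single-date probe can use the index on writtenDate.
  EXPLAIN SELECT * FROM sales_tickets WHERE writtenDate = '12/01/2002';

  -- Slow: the same predicate widened to a month, joined to line_items,
  -- falls back to a sequential scan of sales_tickets.
  EXPLAIN
  SELECT *
  FROM line_items t, sales_tickets s
  WHERE s.writtenDate >= '12/01/2002'
    AND s.writtenDate <  '01/01/2003'  -- illustrative end of the range
    AND t.ticketId = s.ticketId;       -- assumed join key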