On Wed, Dec 22, 2010 at 12:54:06PM -0300, Alvaro Herrera wrote:
> Excerpts from David Fetter's message of mié dic 22 12:36:10 -0300 2010:
> > On Wed, Dec 22, 2010 at 03:00:16PM +0000, Simon Riggs wrote:
> > > On Wed, 2010-12-22 at 09:03 -0500, Andrew Dunstan wrote:
> > > >
> > > > Quite apart from other reasons, such as possible ephemerality of
> > > > the data, the difficulty of taking a reasonable random sample from
> > > > an arbitrary foreign data source seems substantial, and we surely
> > > > don't want ANALYSE to have to run a full sequential scan of a
> > > > foreign data source.
> > >
> > > I think we need something that estimates the size of a table, at
> > > least, otherwise queries will be completely un-optimised.
> >
> > The estimated size for a lot of things--streams of data, for
> > example--is infinity.  I suppose that's a good default for some cases.
>
> Since we don't have streaming queries,
We don't, but other systems do.

> this would seem rather pointless ...  Surely the FDW must be able to
> limit the resultset somehow.

Using LIMIT. :)

Cheers,
David.
--
David Fetter <da...@fetter.org> http://fetter.org/
Phone: +1 415 235 3778  AIM: dfetter666  Yahoo!: dfetter
Skype: davidfetter      XMPP: david.fet...@gmail.com
iCal: webcal://www.tripit.com/feed/ical/people/david74/tripit.ics

Remember to vote!
Consider donating to Postgres: http://www.postgresql.org/about/donate
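
For concreteness, a minimal sketch of the LIMIT idea, assuming the
CREATE FOREIGN TABLE syntax from the SQL/MED work under discussion; the
server and table names (stream_server, remote_events) are made up for
illustration only:

    -- Hypothetical foreign table backed by an unbounded source (a stream),
    -- so no meaningful total row count exists for the planner to use.
    CREATE FOREIGN TABLE remote_events (
        ts      timestamptz,
        payload text
    ) SERVER stream_server;

    -- Even without statistics, the caller can bound the result set explicitly:
    SELECT ts, payload
    FROM remote_events
    LIMIT 100;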