On Mar 21, 11:36 am, [EMAIL PROTECTED] ("Bill Eaton") wrote:
> I want to allow some queries for my users to run for a prescribed period of
> time and kill them if they go over time. Is there a good way to do this? Or
> is this a bad idea?
>
> I've been struggling with trying to figure out the best way to allow users
> to browse through large tables. For example, I have one table with about
> 600,000 rows and growing at about 100,000/month.
>
> I want to allow users to browse through this table, but only if their
> effective SELECT statement only generates 100 or maybe 1000 rows. There are
> several fields that can be used in the WHERE clause, such as user, date,
> model, etc. It will be difficult for me to predict how large a result set is
> a priori. So I want to allow the query to run for a prescribed period of
> time, then kill it.
>
> I'll probably be using ADO --> ODBC at the client. So I could probably kill
> the Connection/Recordset. I just don't know the best way to do it. pgAdmin
> allows queries to be killed. How does it do it?
>
> Thanks in advance,
>
> Bill Eaton
> Thousand Oaks, CA
>
> ---------------------------(end of broadcast)---------------------------
> TIP 4: Have you searched our list archives?
>
>                http://archives.postgresql.org/
You could add a LIMIT clause when you put the query together, to cap the size of the result set the server will return.
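For example (a sketch only — the table and column names here are made up, not from Bill's schema), fetching one row more than the cap lets the client tell "exactly at the cap" apart from "too many rows":

```sql
-- Cap the browse query at 1000 rows. Asking for 1001 lets the
-- client detect overflow: if 1001 rows come back, the real result
-- set is larger than the cap and the user should narrow the filter.
SELECT id, username, model, created_date
FROM measurements                  -- hypothetical table
WHERE username = 'beaton'          -- hypothetical filter values
  AND created_date >= '2007-01-01'
ORDER BY created_date DESC
LIMIT 1001;                        -- 1000-row cap plus one sentinel row
```

The query still has to find the matching rows, but the server stops producing output once the limit is reached, so a huge match set never crosses the wire to ADO/ODBC.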

Travis

