What's the simplest way to grant read-only access to an existing
database? One approach, I guess, would be to create a user who has
SELECT but not INSERT etc. privileges. But it appears that GRANT SELECT
does not work at the schema or database level. This means I'd not only
have to create hundreds of GRANT statements, one for each table...
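One workaround is to generate the per-table grants from the system
catalogs and feed the output back through psql. A minimal sketch,
assuming a role named readonly (the role name is made up for
illustration):

  -- emit one GRANT SELECT per user table; run the result through psql
  SELECT 'GRANT SELECT ON ' || schemaname || '.' || tablename
         || ' TO readonly;'
  FROM pg_tables
  WHERE schemaname NOT IN ('pg_catalog', 'information_schema');

New tables still need their own grants, so the script has to be rerun
after schema changes.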
Bruno Wolff III wrote:
I think deferred triggers can also use a lot of memory.
I do indeed have several columns with REFERENCES x DEFERRABLE INITIALLY
DEFERRED...
Next time I run the procedure, I will try dropping the foreign key
constraints first.
Incidentally, it would be nice if Postgres had some...
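For reference, a sketch of the drop-and-recreate approach mentioned
above; the table, column, and constraint names are placeholders, not
the real schema:

  -- remove the deferred foreign key before the bulk load
  ALTER TABLE detail DROP CONSTRAINT detail_master_fk;

  -- ... run the large INSERT ... SELECT here ...

  -- recreate it afterwards; the whole table is rechecked once here
  ALTER TABLE detail
    ADD CONSTRAINT detail_master_fk
    FOREIGN KEY (master_id) REFERENCES master (id)
    DEFERRABLE INITIALLY DEFERRED;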
Stephan Szabo wrote:
Explain output would also be useful. I would wonder if it's a problem
with a hash that misestimated the necessary size; you might see if
analyzing the tables involved changes its behavior.
I executed ANALYZE just before running the problematic statement. I
will post the EXPLAIN output.
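The sequence looks roughly like this (a sketch with placeholder names;
plain EXPLAIN shows the estimated plan without actually executing the
insert):

  ANALYZE master;
  ANALYZE detail;

  -- estimated plan only; EXPLAIN ANALYZE would run the statement too
  EXPLAIN
  INSERT INTO detail (master_id, val)
  SELECT m.id, m.val FROM master m;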
I'm trying to fill a table with several million rows that are obtained
directly from a complex query.
For whatever reason, Postgres at one point starts using several
gigabytes of memory, which eventually slows down the system until it no
longer responds.
At first I assumed I had unintentionally...
Using an OR or IN query seems to be orders of magnitude slower than
running a query twice. There is a unique index on 'id' and an index on
'model_ns, model'. The number of rows returned is less than 800.
Everything is vacuumed and analyzed. Running on 7.4.1. Perhaps this
situation is something the...
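To illustrate the comparison (the table name and literal values below
are made up; only the 'id' and 'model_ns, model' columns come from the
description), here is the single OR query versus a UNION that lets
each branch use its own index:

  -- single query with an OR across different indexes
  SELECT * FROM items
  WHERE id = 42 OR (model_ns = 3 AND model = 'widget');

  -- the same result as two separate index scans combined
  SELECT * FROM items WHERE id = 42
  UNION
  SELECT * FROM items WHERE model_ns = 3 AND model = 'widget';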
    return 0 if ($_[0] =~ /\\b$_/i);
  }
  else
  {
    # count occurrences of the current term in the text
    my @matches = $_[0] =~ /\\b$_/gi;
    return 0 unless scalar @matches;
    $score += scalar @matches;
  }
}
return $score;
'
LANGUAGE 'plperl';
--
Eric Jain
SELECT score_a(text, CAST('term' AS TEXT)) AS score FROM articles
WHERE score_a(text, CAST('term' AS TEXT)) > 0
ORDER BY score DESC;
This doesn't seem efficient to me. Or are the results from score_a
cached somehow?
score_a is a (rather computation-intensive :-) PL/Perl function which
returns an integer.
I am using PostgreSQL 7.0
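One way to have score_a evaluated only once per row is to compute the
score in a subquery and filter on the result; a sketch (the OFFSET 0 is
the old trick for keeping the planner from flattening the subquery and
duplicating the expression):

  SELECT score
  FROM (SELECT score_a(text, CAST('term' AS TEXT)) AS score
        FROM articles
        OFFSET 0) AS scored
  WHERE score > 0
  ORDER BY score DESC;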
--
Eric Jain