I am using COPY to bulk load large volumes (i.e. multi-GB range) of data into a
staging table in a PostgreSQL 8.3 database. For performance, the staging table
has no constraints, no primary key, etc. I want to move that data into the
"real" tables, but need some advice on how to do that efficiently.
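What I have in mind is a single set-based INSERT ... SELECT in one
transaction, then emptying the staging table for the next load. A sketch
only; the table and column names here are invented:

BEGIN;
-- move everything from staging into the real table in one statement
INSERT INTO events (event_time, host, message)
    SELECT event_time, host, message
    FROM staging_events;
-- empty the staging table for the next COPY; note TRUNCATE takes an
-- exclusive lock on staging_events
TRUNCATE staging_events;
COMMIT;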
I have an app that was previously using a large unpartitioned table with no
problems. I partitioned this table and am now experiencing intermittent hangs
when inserting data into the partitioned table. The stored procedure that does
the insert seems to run to completion even when it 'hangs'.
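For context, the routing function is roughly of this shape: a sketch of the
usual inheritance-partitioning pattern, not my exact code, and all table and
column names are invented:

CREATE OR REPLACE FUNCTION insert_event(p_time timestamptz, p_msg text)
RETURNS void AS $$
BEGIN
    -- route each row to the monthly child table (events_YYYYMM)
    EXECUTE 'INSERT INTO events_' || to_char(p_time, 'YYYYMM')
         || ' (event_time, message) VALUES ('
         || quote_literal(p_time::text) || ', '
         || quote_literal(p_msg) || ')';
END;
$$ LANGUAGE plpgsql;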
A re-post, since I'm really stuck on this and could use some advice on how to
troubleshoot this...
--- On Mon, 12/7/09, Tom Lane wrote:
> Have you looked into pg_locks to see if it's blocked
> waiting for a lock?
> The TRUNCATE in particular would require exclusive lock on
> the table, so it could be waiting for some other process
> that's touched the table.
Thanks Tom - while pg_locks did not reveal...
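For the record, the sort of query I ran against pg_locks looks roughly like
this. A sketch, assuming 8.3 catalogs (where pg_stat_activity exposes
procpid rather than pid):

-- pair each ungranted lock request with a granted lock on the same
-- relation, and show what the holder is currently running
SELECT w.pid           AS waiting_pid,
       h.pid           AS holding_pid,
       w.mode          AS wanted_mode,
       h.mode          AS held_mode,
       a.current_query AS holder_query
FROM pg_locks w
JOIN pg_locks h ON h.granted
               AND NOT w.granted
               AND w.relation = h.relation
               AND w.locktype = h.locktype
JOIN pg_stat_activity a ON a.procpid = h.pid
WHERE w.locktype = 'relation';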
Can anyone point me towards good articles or books that would help a PostgreSQL
novice (i.e. me) learn the optimal approaches to setting up a DB for analytics?
In this particular case, I need to efficiently analyze approximately 300
million system log events (i.e. time series data). It's log data...
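For reference, the layout I've seen suggested for time-series volume like
this is range partitioning by time, using the inheritance-style partitioning
available in 8.x. A sketch with invented names:

-- parent table that queries go through
CREATE TABLE log_events (
    event_time timestamptz NOT NULL,
    host       text,
    message    text
);

-- one child per month; the CHECK constraint lets the planner skip
-- partitions that can't match a time-range predicate
CREATE TABLE log_events_200912 (
    CHECK (event_time >= '2009-12-01' AND event_time < '2010-01-01')
) INHERITS (log_events);

CREATE INDEX log_events_200912_time_idx ON log_events_200912 (event_time);

-- needed so the planner actually uses the CHECK constraints
SET constraint_exclusion = on;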