In article <[EMAIL PROTECTED]>, "Unknown"
<[EMAIL PROTECTED]> wrote:
> I have an idea that might help. I found ODBC to be very slow for
> importing data, so I wrote a program in C that reads in dump files of SQL
> text on the Linux server itself, e.g. the first line is a CREATE TABLE,
> the next lines are a
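The approach described above, feeding a dump file of plain SQL text to the backend on the server itself, can be sketched with standard tools. This is a minimal illustration, not the poster's actual program; the table, column, and file names are made up:

```shell
# Hypothetical dump file of plain SQL text: first line a CREATE TABLE,
# following lines the data-loading statements.
cat > /tmp/bulkload.sql <<'EOF'
CREATE TABLE t_demo (id integer, val text);
INSERT INTO t_demo VALUES (1, 'one');
INSERT INTO t_demo VALUES (2, 'two');
EOF
# Feeding this file straight to the backend on the server avoids the
# per-row ODBC round trips:
#   psql -f /tmp/bulkload.sql mydb
wc -l < /tmp/bulkload.sql   # 3
```

The psql invocation is commented out since it assumes a running server and a database named `mydb`.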
In article <[EMAIL PROTECTED]>,
"Creager, Robert S" <[EMAIL PROTECTED]> wrote:
> I think this is a question regarding the backend, but...
[snip]
> (COPY u FROM stdin). The backend process which handles the db connection
> decides that it needs a whole lot of memory, although in a nice
> control
Yes, I was loading a large table. :-)
The filesystem with pg_xlog filled up, and the
backend (all backends) died abnormally. I can't
restart postmaster, either.
There are no stray IPC resources left allocated.
Is it OK to delete the files from pg_xlog? What
will be the result?
Will I be able to avoid this problem by splitting
the load data
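Splitting the load data, as asked above, can be sketched with standard tools: each chunk then loads and commits separately, so WAL space can be recycled between runs. All paths and the table name here are hypothetical:

```shell
# Hypothetical sketch: break one big data file into fixed-size chunks.
rm -rf /tmp/chunks && mkdir /tmp/chunks
seq 1 1000 > /tmp/rows.dat            # stand-in for the real load data
split -l 250 /tmp/rows.dat /tmp/chunks/part_
ls /tmp/chunks                        # part_aa part_ab part_ac part_ad
# Each chunk then goes in via its own COPY in its own transaction, e.g.:
#   for f in /tmp/chunks/part_*; do psql -c "\copy bigtable from '$f'" mydb; done
```

The psql loop is commented out since it assumes a running server and a table named `bigtable`.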
In article
<[EMAIL PROTECTED]>,
"Mikheev, Vadim" <[EMAIL PROTECTED]> wrote:
>> Is it OK to delete the files from pg_xlog? What will be the result?
> It's not Ok. Though you could remove files numbered from 000
> to 00012 (in hex), if any.
OK, thanks. Is there any documentat
In article <[EMAIL PROTECTED]>, "Joseph"
<[EMAIL PROTECTED]> wrote:
> I am switching from the RPM install of Postgres to a version compiled
> from source. I have this running fine, but now my PHP4 has quit working.
>
> So I am trying to compile it and get the error that it cannot find
> postgres.h
>
> Does it n
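A sketch of the usual fix, assuming a source install under /usr/local/pgsql (adjust the prefix to your site): in the PostgreSQL 7.x source tree, server headers such as postgres.h are installed by a separate `make install-all-headers` step, and PHP4's configure takes the install prefix via `--with-pgsql`:

```shell
# Hypothetical install prefix; adjust to where the compiled PostgreSQL lives.
PGDIR=/usr/local/pgsql
# In the PostgreSQL source tree, install the server headers (postgres.h etc.):
#   make install-all-headers
# Then rebuild PHP4 pointing its configure at that prefix:
echo "./configure --with-pgsql=$PGDIR"
```

The make and configure steps are shown as comments since they assume the PostgreSQL and PHP4 source trees are present.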
Hello all,
I'm running 7.1RC2 and have a question/problem:
I have a table which is 28150 pages in size. It has two indices
of 8001 and 9750 pages.
The filesystem on which pg_xlog resides has ~750MB free.
No other PostgreSQL work is running.
Yet, when running VACUUM ANALYZE on this table, I r
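At PostgreSQL's default 8 kB page size, the figures above work out to rough on-disk sizes; a back-of-the-envelope check, not an exact estimate of the WAL traffic a VACUUM generates:

```shell
# 8 kB pages (the default BLCKSZ); integer MB, rounded down.
echo "table:  $((28150 * 8 / 1024)) MB"   # 219 MB
echo "index1: $(( 8001 * 8 / 1024)) MB"   # 62 MB
echo "index2: $(( 9750 * 8 / 1024)) MB"   # 76 MB
```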