On Wednesday 27 August 2003 13:50, Tarhon-Onu Victor wrote:
>
> shared_buffers = 520
> max_locks_per_transaction = 128
> wal_buffers = 8
> max_fsm_relations = 3
> max_fsm_pages = 482000
> sort_mem = 131072
> vacuum_mem = 131072
> effective_cache_size = 1
> random_page_cost = 2
Slightly off
> Of course, I checked the error status for every insert, there is
> no error. It seems like in my case the postgres server cannot handle so
> many inserts per second, so some of the lines are not being parsed and
> their data not inserted into the database.
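Checking the status of every insert is the right instinct: a well-behaved driver either raises an error or reports failure per statement, and rows don't vanish silently. The thread's script is Perl/DBI against PostgreSQL; as a minimal illustration of the same per-insert error accounting, here is a hedged Python sketch using the stdlib sqlite3 module as a stand-in driver (table name and columns are invented for the example):

```python
import sqlite3

# Stand-in for the Perl/DBI loop: count successes and failures per INSERT,
# so a "silently dropped" row would show up as a caught error instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (ts TEXT NOT NULL, msg TEXT NOT NULL)")

rows = [("2003-08-27", "line ok"), ("2003-08-27", None)]  # second row is bad data
inserted, failed = 0, 0
for row in rows:
    try:
        conn.execute("INSERT INTO log (ts, msg) VALUES (?, ?)", row)
        inserted += 1
    except sqlite3.IntegrityError:
        failed += 1  # without this check the bad row would just look "missing"
conn.commit()
print(inserted, failed)  # → 1 1
```

If every insert reports success yet rows are missing, the loss is almost always upstream of the database (the pipe or the parsing), not in the server.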
That sounds extremely unlikely. Postgres is not one
On 27 Aug 2003 at 15:50, Tarhon-Onu Victor wrote:
>
> Hi,
>
> I have a (big) problem with postgresql when making lots of
> inserts per second. I have a tool that is generating an output of ~2500
> lines per second. I wrote a script in Perl that opens a pipe to that
> tool, reads
On Wed, 27 Aug 2003, Bruno Wolff III wrote:
> Did you check the error status for the records that weren't entered?
>
> My first guess is that you have some bad data you are trying to insert.
Of course, I checked the error status for every insert, there is
no error. It seems like in my c
> > The problem is that only ~15% of the lines are inserted into
> > the database. The same script modified to insert the same data in a
> > similar table created in a MySQL database inserts 100%.
>
> Did you check the error status for the records that weren't entered?
>
> My first guess is that y
On Wed, Aug 27, 2003 at 15:50:32 +0300,
Tarhon-Onu Victor <[EMAIL PROTECTED]> wrote:
>
> The problem is that only ~15% of the lines are inserted into
> the database. The same script modified to insert the same data in a
> similar table created in a MySQL database inserts 100%.
Did you
Hi,
I have a (big) problem with postgresql when making lots of
inserts per second. I have a tool that is generating an output of ~2500
lines per second. I wrote a script in Perl that opens a pipe to that
tool, reads every line and inserts data.
I tried both committed an
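At ~2500 lines per second, how inserts are grouped into transactions usually matters far more than any configuration knob: autocommitting each row costs a sync per row, while committing in batches amortizes that cost. As a hedged sketch of that loop (the real script is Perl reading from a pipe into PostgreSQL; the function name, delimiter, and batch size below are invented, and stdlib sqlite3 again stands in for the driver):

```python
import sqlite3

def load_stream(lines, conn, batch_size=500):
    """Insert one row per input line, committing every batch_size rows."""
    cur = conn.cursor()
    pending = 0
    total = 0
    for line in lines:
        fields = line.rstrip("\n").split("|")  # assumed field delimiter
        cur.execute("INSERT INTO log (ts, msg) VALUES (?, ?)", fields[:2])
        total += 1
        pending += 1
        if pending >= batch_size:
            conn.commit()  # one commit per batch instead of per row
            pending = 0
    conn.commit()  # flush the final partial batch
    return total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log (ts TEXT, msg TEXT)")
n = load_stream(("2003-08-27|event %d" % i for i in range(1200)), conn)
print(n)  # → 1200
```

With PostgreSQL specifically, bulk-loading through COPY is faster still than batched INSERTs when the input can be fed to it directly.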