Hi Julian,
Here's my complete Python script:
import psycopg2
import random
import math
import time
connection = psycopg2.connect('host=localhost dbname=pgtest user=pgtest password=pgtest')
cursor = connection.cursor()
while True:
    id = random.randrange(1, 1000 * 1000)
    cursor.execute('sel
Hi Bart,
You are doing heavy random reads in addition to a 1000k-row insert,
all within a single transaction.
Also, it is not clear whether each query inside your Python loop runs in
its own transaction (setting psycopg2's connection.autocommit to True
disables transactional behaviour, so every statement commits
immediately). For example:
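A minimal sketch of what I mean, reusing the connection settings from
your script (the table name and loop bound are placeholders, adjust to
your schema):

import psycopg2
import random

conn = psycopg2.connect('host=localhost dbname=pgtest user=pgtest password=pgtest')

# With autocommit enabled, each execute() runs in its own implicit
# transaction and commits immediately, instead of accumulating all the
# work inside one long-running transaction.
conn.autocommit = True

cur = conn.cursor()
for i in range(1000):
    id = random.randrange(1, 1000 * 1000)
    # 'test_table' is a placeholder name; substitute your own table.
    cur.execute('SELECT * FROM test_table WHERE id = %s', (id,))
    cur.fetchall()

cur.close()
conn.close()

Alternatively, leave autocommit off and call conn.commit() every N
iterations so no single transaction grows too large.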
Depending on your
Hi all
We're experiencing a very strange performance issue. Our setup is a bit
more complicated, but we've managed to isolate and replicate the core
problem. Here's what we observe:
We took a powerful machine (128 GB RAM, 8-core CPU, SSD drives...) and
installed a fresh copy of PostgreSQL 9.2 (Ubun
Test:
1. create a table with a range type column.
2. insert 1000 identical values into that column.
3. analyze
4. n_distinct will still be listed as -1 (unique) for the column.
Why?
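For reference, a rough repro sketch of the above in Python/psycopg2
(connection settings, table and column names are placeholders):

import psycopg2

conn = psycopg2.connect('host=localhost dbname=postgres user=postgres')
conn.autocommit = True
cur = conn.cursor()

# 1. Create a table with a range type column.
cur.execute('CREATE TABLE range_test (r int4range)')

# 2. Insert 1000 identical values into that column.
cur.execute("INSERT INTO range_test "
            "SELECT '[1,10)'::int4range FROM generate_series(1, 1000)")

# 3. Analyze.
cur.execute('ANALYZE range_test')

# 4. Look up n_distinct for the column; per the report above, this
#    still shows -1 (unique) even though every value is identical.
cur.execute("SELECT n_distinct FROM pg_stats "
            "WHERE tablename = 'range_test' AND attname = 'r'")
print(cur.fetchone())

cur.close()
conn.close()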
--
Josh Berkus
PostgreSQL Experts Inc.
http://pgexperts.com