Hi. I was running a postgres script that made modifications to a database
with a column tagged as UNIQUE, and in the middle ran "vacuumdb --analyze"
from another console. "vacuumdb" reported an error along the lines of "duplicate
key" (sorry, I did not save the exact error!). After that, all attemps
to vacuum reported the same error; and I could not re-create the index
for the unique column because I would get the same "duplicate key" message.
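
For reference, the sequence was roughly this; the index name below is a
guess based on PostgreSQL's usual naming for UNIQUE constraints (the table
and column are the ones in the script further down), and I did not record
the real one:

  $ vacuumdb --analyze wms

That is the command that first reported "duplicate key". Then, to try to
rebuild the unique index:

  wms=> DROP INDEX user_names_lc_name_key;
  wms=> CREATE UNIQUE INDEX user_names_lc_name_key ON user_names (lc_name);

and the CREATE UNIQUE INDEX failed with the same "duplicate key" message.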
The postgres version:

wms=> SELECT version();

I'm running Red Hat 7.0, using the "postgresql-7.0.2-17" package.

The script that was running is very simple, but very strange. It looked
like this:

CREATE TABLE user_names (

As you can see, this sets up a table, and then does many inserts and
updates, each in their own transaction. Most of the inserts fail because
the "lc_name" field will conflict with a row already in existence, but
all of the updates succeed and update exactly one row (although in all
but a few cases the update has no effect, because it is assigning the
value to the column that it already has).

I tried to repeat this problem, but couldn't do it...it only happened
once. I realize that this is an old version of Postgres, but I read an
email recently where one of the PostgreSQL developers was requesting
examples of corrupted index files in version 7.0 or later, so I decided
to send it in anyway. Sorry, but at the time I didn't think and just
dropped the table and re-started...now I wish I'd saved the exact error
messages and preserved the database too!

Thanks for the great database!
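
P.S. In case it is useful, here is a rough sketch of the kind of script
it was; the original was not preserved, so apart from the user_names
table and the lc_name column, everything here is a guess from memory:

  CREATE TABLE user_names (
      id       INTEGER,
      name     TEXT,
      lc_name  TEXT UNIQUE
  );

  -- followed by thousands of pairs like this, each statement running
  -- in its own transaction (plain autocommit, no explicit BEGIN):
  INSERT INTO user_names (id, name, lc_name)
      VALUES (1, 'Some Name', 'some name');   -- usually fails on lc_name
  UPDATE user_names SET name = 'Some Name'
      WHERE lc_name = 'some name';            -- always hits exactly one row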