On Sat, Oct 30, 2010 at 9:30 PM, Arturas Mazeika <maze...@gmail.com> wrote:
> Thanks for the info, this explains a lot.
>
> Yes, I am upgrading from the 32-bit version to the 64-bit one.
>
> We have pretty large databases (some with over 1 trillion rows, and some
> containing large documents in blobs). Being able to give Postgres more
> than the 4 GB limit is something we have long been waiting for. Postgres
> was able to handle large datasets (I suppose it uses something like a
> long long (64-bit) data type in C++), and I naively hoped that Postgres
> would be able to migrate from one version to the other without too much
> trouble.
>
> I tried to pg_dump one of the DBs with large documents, and it failed
> with an out-of-memory error. I suppose migration is rather hard in my
> case :-( Any suggestions?
Yikes, that's not good.  How many tables do you have in your database?
How many large objects?  Any chance you can coax a stack trace out of
pg_dump?

--
Robert Haas
EnterpriseDB: http://www.enterprisedb.com
The Enterprise PostgreSQL Company

--
Sent via pgsql-bugs mailing list (pgsql-bugs@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-bugs
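[Editor's note: for readers hitting the same wall, here is a sketch of how
the counts and stack trace Robert asks for could be gathered. It assumes a
PostgreSQL 9.0+ server (the pg_largeobject_metadata catalog was added in
9.0), a debug build of pg_dump, and a placeholder database name "mydb".]

    -- Count ordinary tables and large objects; pg_dump keeps a TOC
    -- entry per object, so very large counts matter here.
    $ psql -d mydb -Atc "SELECT count(*) FROM pg_catalog.pg_class WHERE relkind = 'r'"
    $ psql -d mydb -Atc "SELECT count(*) FROM pg_largeobject_metadata"

    -- Run pg_dump under gdb and print a backtrace when it fails:
    $ gdb --args pg_dump -Fc -f mydb.dump mydb
    (gdb) run
    ... out of memory ...
    (gdb) bt

On pre-9.0 servers, counting large objects with
SELECT count(DISTINCT loid) FROM pg_largeobject should work instead.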