On 16/10/24 22:55, Ron Johnson wrote:
On Wed, Oct 16, 2024 at 3:37 PM Andy Hartman <hartman60h...@gmail.com> wrote:

    I am very new to Postgres and have always worked in the MSSQL
    world. I'm looking for suggestions on DB backups. I currently have
    a DB used to store historical information, including images; it's
    currently around 100 GB.

    I'm looking to take a monthly backup, as I archive a month of data
    at a time. I'm looking for the backup to be compressed, and I have
    a machine with multiple CPUs and ample memory.

    Suggestions on things I can try?
    I did a pg_dump using these params:
    --format=t --blobs lobarch

    it ran my device out of storage:

    pg_dump: error: could not write to output file: No space left on
    device

    I have 150 GB free on my backup drive... can obviously add more

    looking for the quickest and smallest backup file output...

    Thanks again for help/suggestions


Step 1: redesign your DB to *NOT* use large objects. It's an old, slow, and unmaintained data type. The bytea data type is what you should use.
You mean bytea, I guess. As a side note (not a fan of LOs), I had the impression that certain drivers, such as JDBC, support streaming for LOs but not for bytea? It's been a while since I last hit the docs, though.
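
For what it's worth, a minimal one-off conversion sketch for Step 1, assuming a hypothetical images table whose image_oid column holds large-object OIDs (lo_get() needs 9.4 or later; table and column names are made up):

    # add a bytea column and copy the large-object contents into it
    psql -d lobarch -c "ALTER TABLE images ADD COLUMN data bytea;"
    psql -d lobarch -c "UPDATE images SET data = lo_get(image_oid);"
    # once verified, unlink the LOs and drop the old column:
    # psql -d lobarch -c "SELECT lo_unlink(image_oid) FROM images;"
    # psql -d lobarch -c "ALTER TABLE images DROP COLUMN image_oid;"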

Step 2: show us the "before" df output, the whole pg_dump command, and the "after" df output when it fails. "du -c --max-depth=0 $PGDATA/base" is also very useful.
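
If it helps, the database and large-object sizes can also be checked from psql before dumping (standard catalog functions; the database name is assumed):

    psql -d lobarch -c "SELECT pg_size_pretty(pg_database_size(current_database()));"
    psql -d lobarch -c "SELECT pg_size_pretty(pg_total_relation_size('pg_largeobject'));"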

And tell us what version you're using.
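
Without knowing the version, a hedged sketch of what usually gives the quickest and smallest dump for a DB this size: directory format with parallel workers and compression (job count and paths are examples):

    pg_dump --version    # or: psql -c "SELECT version();"
    # -Fd: directory format, -j: parallel workers, -Z 9: max gzip compression
    pg_dump -Fd -j 4 -Z 9 -f /backup/lobarch_2024-10 lobarch

One caveat: older pg_dump versions put all large objects in a single archive entry, so -j won't parallelize that part of an LO-heavy database.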

--
Death to <Redacted>, and butter sauce.
Don't boil me, I'm still alive.
<Redacted> crustacean!
