> Sorry for CC to mailing list, I forgot!!
No problem, I do sometimes. Then on other lists, I do it when I shouldn't.
> > The standard backup tool for PostgreSQL is pg_dump - this indeed just backs up the data. You should not rely on
> > copying the files of a running database - that won't always work.
> But I stop the postgres service before copying all the files, so the database is not running.
If the server is stopped, then yes, it's perfectly safe to back up the data directory, but make sure you take the whole directory, including the pg_xlog/pg_clog stuff.
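If it helps, a minimal sketch of that (the data directory path is just an example - use your actual PGDATA, or your distribution's init script instead of pg_ctl):

  # stop the server so the files are quiescent
  pg_ctl -D /var/lib/pgsql/data stop
  # archive the whole data directory - xlog/clog subdirectories included
  tar czf pgdata-backup.tar.gz /var/lib/pgsql/data
  # bring the server back up
  pg_ctl -D /var/lib/pgsql/data start

A plain cp -a of the directory is fine too; what matters is that the server is fully shut down and you copy everything, not just the base/ files.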
> The problem is that I use a backup tool that backs up data from my server (MySQL and Postgres databases included) in
> that way (I know pg_dump, I used it before) and afterwards transfers the backup file (a kind of tar file) via FTP to
> another machine. Of course I can schedule it. My idea was to increase the protection of my db data.
Well, it sounds like you want point-in-time recovery (PITR). This is being worked on at the moment, and would allow you to take a full backup nightly and incremental hourly backups of the WAL files. Replaying the logs then allows you to restore your database to a set hour.
PITR is under development and may well be in the next version, but there are no guarantees. See the archives of the hackers list for details.
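In the meantime, pg_dump is still the way to get a consistent backup of a running database, so your tool could run it (or pg_dumpall for the whole cluster) before the FTP transfer. Something like this - database name and paths are just examples:

  # plain-SQL dump of one database; restore later with psql
  pg_dump -U postgres mydb > /var/backups/mydb.sql
  # or the whole cluster, roles and all
  pg_dumpall -U postgres > /var/backups/cluster.sql

Either file is just text, so it will compress and FTP nicely alongside your MySQL dumps.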
> What about the external storage arrays?
These are big boxes of disks with e.g. a firewire connection. Expensive, and probably not what you're after.
If you want some insurance against failure of your database server, you could look at replication. There are a number of solutions; one currently under development is called "slony" - googling and searching the archives should get you the details. I believe it's in testing at the moment.
--
  Richard Huxton
  Archonet Ltd