> It probably depends on what you call "serious". Anyway, the project I am
> working on is an online community for alternative investments and is built
> around a PostgreSQL (first 7.0, now 7.1) database: it's
> <http://village.albourne.com> but unfortunately most of it is limited
> only to subscribers so there is not a lot db-related to see. It's
> PostgreSQL + Apache + mod_perl on Digital Unix.

I would define "serious use" as use in transactional applications where the loss of data input by users is a very bad thing, and the uptime requirements are 24x7, with overall availability requirements of >99%.

As an example, and where most of my past experience has been, consider a reservation system for an airline or hotel chain. Such a system may handle hundreds to thousands of transactions per second. More importantly, tens per second of those are transactions which must not be lost - i.e. reservations/changes/cancellations - and which are worth real money. Losing, say, 15 minutes of these is a catastrophe. Note also that transactions are not equivalent to page views - in this case a single "page view" would result in a series of many database operations to generate a single response.

An even tougher example would be an online financial system such as an ATM debit system. In that case, you can hand someone a lot of money as a result of a transaction. Loss of that data is exactly loss of the money!

In the case of PostgreSQL, as far as I can tell, one could lose all data since the previous dump if one lost the database media. In Oracle or Informix, that is *not* true, because they can do a point-in-time restore from the last full save, based on the WALs.
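The recovery model described in the last paragraph can be sketched in the abstract: a point-in-time restore rebuilds state by loading the last full save and then replaying every logged change committed after that save, up to the chosen moment. The Python below is a purely illustrative sketch of that idea, under simplified assumptions; every name in it (`LogRecord`, `Backup`, `point_in_time_restore`) is invented for the example and is not any vendor's actual API or WAL format.

```python
# Illustrative sketch of point-in-time recovery via write-ahead-log replay.
# All names here are invented for illustration, not any vendor's API.

from dataclasses import dataclass, field

@dataclass
class LogRecord:
    timestamp: int  # when the change was committed
    key: str        # which row/account the change touches
    value: int      # the new value written

@dataclass
class Backup:
    timestamp: int                       # when the full save was taken
    data: dict = field(default_factory=dict)

def point_in_time_restore(backup, wal, target_time):
    """Rebuild state as of target_time: start from the last full save,
    then apply every WAL record committed after the backup but no later
    than the target time."""
    state = dict(backup.data)
    for rec in wal:
        if backup.timestamp < rec.timestamp <= target_time:
            state[rec.key] = rec.value
    return state

# A dump-only strategy can at best return backup.data as of t=100;
# WAL replay also recovers changes committed after the dump.
backup = Backup(timestamp=100, data={"acct_a": 50})
wal = [
    LogRecord(timestamp=90,  key="acct_a", value=50),   # already in backup
    LogRecord(timestamp=120, key="acct_a", value=75),   # after backup
    LogRecord(timestamp=150, key="acct_b", value=10),   # after backup
]
print(point_in_time_restore(backup, wal, target_time=130))
# → {'acct_a': 75}  (the t=150 write is past the restore point)
```

This is the distinction the paragraph above draws: with only periodic dumps, a media loss forfeits everything committed since `backup.timestamp`, whereas a surviving WAL lets the administrator roll forward to any point between the dump and the failure.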