It depends on what you mean by database. If you want really fast I/O, try pytables. "PyTables is a hierarchical database package designed to efficiently manage very large amounts of data."
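A minimal sketch of what using it looks like, assuming a recent PyTables release (the `tables` package) plus NumPy are installed; the file name and array contents here are just illustrative:

```python
import tempfile, os
import numpy as np
import tables  # the PyTables package

# Some example data: a million float64 values.
data = np.arange(1_000_000, dtype=np.float64)

path = os.path.join(tempfile.mkdtemp(), "demo.h5")

# Write the array to an HDF5 file with zlib compression enabled.
with tables.open_file(path, mode="w") as f:
    filters = tables.Filters(complevel=5, complib="zlib")
    f.create_carray(f.root, "data", obj=data, filters=filters)

# Read it back; slicing the node returns a NumPy array.
with tables.open_file(path, mode="r") as f:
    restored = f.root.data[:]

assert np.array_equal(restored, data)
```

Compression (the `Filters` object) is what makes the "faster than raw I/O" claim below possible for compressible data: less bytes hit the disk.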
http://pytables.sourceforge.net/html/WelcomePage.html

Some more comments from the webpage:

- High-performance I/O: on modern systems, and for large amounts of data, table and array objects can be read and written at a speed limited only by the performance of the underlying I/O subsystem. Moreover, if your data is compressible, even faster than your maximum I/O throughput (!).

- Support for files bigger than 2 GB: so you won't be limited if you want to deal with very large datasets. In fact, PyTables supports full 64-bit file addressing even on 32-bit platforms (provided the underlying filesystem does so too, of course).

- Architecture-independent: PyTables has been carefully coded (as has HDF5 itself) with little-endian/big-endian byte-ordering issues in mind. So you can write a file on a big-endian machine (like a SPARC or MIPS) and read it on a little-endian one (like Intel or Alpha) without problems.

- Portability: PyTables has been ported to many platforms, namely Linux, Windows, Mac OS X, FreeBSD, Solaris, and IRIX, and probably works on many more. Moreover, it also runs fine on 64-bit platforms (like AMD64, Intel64, UltraSPARC or MIPS RXX000 processors).

--
http://mail.python.org/mailman/listinfo/python-list