Mike Meyer wrote:
Yes, they are present no matter what representation you use. The question is: how do the answers change if you change the format? These days, cross-platform means you deal with length as well as endian issues. Or maybe you don't, depending on the db. I know the answers for text files (easy, easy, very, yes). Can you propose a db scheme that gets the same answers?
I think I don't understand the question. If the database contains the number "42" in a field typed "int32", in a row, and handles endianness correctly, why would I get a different number on different platforms?
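To spell out what "handles endianness" means at the byte level: a raw binary file forces the writer and reader to agree on a byte order explicitly, whereas a db driver makes that choice for you. A quick Python sketch (my own illustration, not from either post):

```python
import struct

n = 42
big = struct.pack(">i", n)     # big-endian encoding of a 32-bit int
little = struct.pack("<i", n)  # little-endian encoding of the same value

# The raw bytes on disk differ between the two formats...
assert big != little

# ...but decoding with the byte order the writer used recovers the
# same number on any platform.
assert struct.unpack(">i", big)[0] == 42
assert struct.unpack("<i", little)[0] == 42
```

So as long as the storage layer fixes one byte order (or records which one it used), the value round-trips identically everywhere.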
(A side note about sqlite: it's actually weakly typed - column declarations are only type affinities, not constraints, so the type you store is not necessarily the type you get back.)
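A small demonstration of that sqlite behavior (my own sketch, using Python's stdlib sqlite3 binding): insert a *string* into an INTEGER column, and the column's type affinity silently coerces it.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (n INTEGER)")

# Bind the text "42", not the integer 42.
con.execute("INSERT INTO t VALUES (?)", ("42",))

row = con.execute("SELECT n, typeof(n) FROM t").fetchone()
# → (42, 'integer'): the INTEGER affinity converted the text on insert.
print(row)
```

So sqlite won't reject ill-typed data the way a strictly typed db would; it just does its best to coerce it.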
I hate to tell you this, but your XML solution would still consist of a bunch of one-off file formats, one for each and every purpose. Using XML just fixes the syntax of the file, not the semantics. Settling on XML (or JSON, or INI, or cap files, or ...) is sort of like settling on UTF, only less obviously a win. Sure, you get to use canned code that will turn your text file into a structure in memory, but you still have to figure out what it all means. As you say, the XML toolset is the real win: smart editors, validators, schemas (which make the editors and validators even more powerful) are all good things. But most people don't really seem interested in this beyond editors, and that's not much of a win.
I agree that validation in XML is a strong point - but one of the reasons people like text files is that they DON'T usually have validation features :)
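Even without a schema, an XML parser gives you well-formedness checking for free, which ad-hoc text formats never do. A minimal sketch with Python's stdlib (function name is mine):

```python
import xml.etree.ElementTree as ET

def is_well_formed(doc: str) -> bool:
    """Return True if doc parses as well-formed XML."""
    try:
        ET.fromstring(doc)
        return True
    except ET.ParseError:
        return False

assert is_well_formed("<config><port>8080</port></config>")
assert not is_well_formed("<config><port>8080</config>")  # mismatched tag
```

Full schema validation (DTD, XML Schema, RELAX NG) needs a third-party library such as lxml, but the structural check alone already catches the typo class of errors.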
     | pro                            | contra
-----+--------------------------------+---------------------------------
XML  | standard tools, validation,    | evil manual parsing, bad rep
     | can embed multiple data        |
     | structures in a standard way   |
-----+--------------------------------+---------------------------------
text | standard tools, sometimes      | no validation, manual parsing,
     | human readable                 | usually one data structure per
     |                                | file