On 1/11/12 12:16, Máté Koch wrote:
Hello All,
I'm developing an app which stores its data in a file-system database. The data
in my case consists of large Python objects, mostly dicts, containing text and
numbers. The easiest way to dump and load them would be pickle, but I have a
problem with it: I want to keep the data under version control, and I would like
to use it as efficiently as possible. Is it possible to force pickle to store
otherwise unordered data (dictionaries, for example) in an ordered way, so
that if I dump a large dict, change one tiny thing in it, and dump it again, the
diff between the old file and the new one is minimal?
If pickle is not the best choice for me, can you suggest anything else? (If
there isn't a solution for this yet, I will write the module myself, of course,
but first I'd like to look around and make sure it hasn't already been written.)
Json kinda sucks. Try yaml.
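Something like this, for instance (just a sketch, not tested against your data;
it assumes PyYAML is installed, and safe_dump writes mapping keys in sorted
order, so re-dumping after a small change gives a small diff):

import yaml

data = {"title": "example", "count": 3, "tags": ["a", "b"]}

with open("data.yaml", "w") as f:
    yaml.safe_dump(data, f, default_flow_style=False)  # block style, sorted keys

with open("data.yaml") as f:
    assert yaml.safe_load(f) == data  # round-trips back to the same dict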
If your data is simple enough, you can just write and read your own
format. Sort it first and you're golden.
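For example, stdlib only (a sketch that assumes a flat dict of strings and
numbers; one sorted key per line, so a diff only touches the lines that
actually changed):

import ast

def save(d, path):
    with open(path, "w") as f:
        for key in sorted(d):
            # repr() escapes tabs/newlines inside strings, so one entry per line is safe
            f.write("%r\t%r\n" % (key, d[key]))

def load(path):
    result = {}
    with open(path) as f:
        for line in f:
            key, value = line.rstrip("\n").split("\t", 1)
            result[ast.literal_eval(key)] = ast.literal_eval(value)
    return result

data = {"title": "some text", "count": 3}
save(data, "flat.txt")
assert load("flat.txt") == data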
You might also try sorted dicts. I don't know if those will come out of
pickle any differently than regular dicts, but it's worth trying.
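A quick way to check (assumes Python 2.7+ for collections.OrderedDict): an
OrderedDict pickles its items in insertion order, so building it from sorted
items at least makes the stream deterministic, though pickle's output still
isn't line-oriented, so text diffs will stay noisy.

import pickle
from collections import OrderedDict

data = {"b": 2, "a": 1, "c": 3}
ordered = OrderedDict(sorted(data.items()))  # fix the item order before pickling
blob = pickle.dumps(ordered, 0)              # protocol 0 is plain ASCII
print(blob)
assert pickle.loads(blob) == data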
You can also write your own serialization layer on top of any of the
previously mentioned formats. If you sort the data during serialization,
it will come out sorted in the disk file.
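Rough sketch of that idea (stdlib only, handles nested dicts of text and
numbers; keys are written in sorted order, one entry per line, and
ast.literal_eval reads the file back):

import ast

def dump_sorted(obj, out, indent=0):
    pad = " " * indent
    if isinstance(obj, dict):
        out.write("{\n")
        for key in sorted(obj):
            out.write("%s  %r: " % (pad, key))
            dump_sorted(obj[key], out, indent + 2)
            out.write(",\n")
        out.write(pad + "}")
    else:
        out.write(repr(obj))  # strings, numbers, lists, etc.

def load(path):
    with open(path) as f:
        return ast.literal_eval(f.read())

data = {"title": "example", "count": 3, "nested": {"b": 2, "a": 1}}
with open("db.txt", "w") as f:
    dump_sorted(data, f)
    f.write("\n")
assert load("db.txt") == data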
--rich