Good afternoon,

If it's not too big a task, you could even convert the data
structure to JSON, which is quite a close match to what you
have now, and the json module will help you read/write
it.

I would agree with the JSON recommendation (at least until your
data set grows to more than 10GB in size). Also, if you are working
with JSON, you can add and delete elements and the serialization
layer doesn't care. This makes it more flexible than a typical
CSV/TSV format. The application still has to know about the data
types, of course.
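
To make that concrete, here is a minimal sketch of the round trip
(the file name and the sample data are just illustrative; on Python
2.4 you would import simplejson as json instead, as described
below):

  import json

  records = {'alice': {'age': 31, 'city': 'Oslo'},
             'bob':   {'age': 27, 'city': 'Turku'}}

  # -- write the whole structure out in one call
  f = open('people.json', 'w')
  json.dump(records, f, indent=2)
  f.close()

  # -- read it back in; you get the same nested dicts/lists
  f = open('people.json')
  records = json.load(f)
  f.close()

  # -- add or drop fields freely; the next json.dump() neither knows
  #    nor cares that the 'shape' of the records changed
  records['alice']['email'] = 'alice@example.org'
  del records['bob']['city']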

I am not seeing JSON listed among Python's standard libraries for
version 2.4.4. Is this something that has to be installed
separately?

Yes. The json module was only added in Python 2.6. If you wanted to
work with JSON in Python 2.4, you had to use a third-party package
called simplejson [0]. According to this post [1], the standard
library json module is an (older) release of the simplejson module.

Anyway, when I found myself stuck in Python-2.4 land (on the stock Python shipped with CentOS-5.x, for example), I often saw and wrote snippets like this [1]:

  try:  # -- if Python-2.6+, it's in STDLIB
      import json
  except ImportError:  # -- Python-2.4, simplejson
      import simplejson as json

And then the rest of the program can operate just the same,
regardless of which one was imported.
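
For example, after that try/except, the rest of the code uses the
same calls whichever module was imported (the data here is just an
example):

  data = {'name': 'example', 'values': [1, 2, 3]}
  text = json.dumps(data, indent=2)    # same API in json and simplejson
  print json.loads(text)['values']     # prints [1, 2, 3]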

Good luck,

-Martin

 [0] https://pypi.python.org/pypi/simplejson/
 [1] http://stackoverflow.com/questions/712791/what-are-the-differences-between-json-and-simplejson-python-modules

--
Martin A. Brown
http://linux-ip.net/
