I'm interested in writing two programs, A and B, which communicate using JSON. At a high level, A wants to transfer an array to B.
However, I would very much like to make it possible for A and B to run in parallel, so my current plan is to have A output and B read a *sequence* of JSON objects. In other words, instead of

    [ {"a": 0}, {"b": 0}, {"c": 0} ]

it would just send

    {"a": 0} {"b": 0} {"c": 0}

I know about the raw_decode() method on the json.JSONDecoder class, and that gets me most of the way there. However, what I'm *not* sure about is the best way to feed input to raw_decode(), which expects a "string or buffer":

    >>> d = json.JSONDecoder()
    >>> d.raw_decode(sys.stdin)
    Traceback (most recent call last):
      ...
      File "json\scanner.py", line 42, in iterscan
        match = self.scanner.scanner(string, idx).match
    TypeError: expected string or buffer

Now I'm not very familiar with buffers, or with whether a file or stdin could be used as one in an incremental fashion, but the best approach I can come up with is the following:

1. Read a line of input.
2. Try to decode it.
3. If that fails, read another line, concatenate it to the end, and try again.
4. And so on.

That seems... inelegant, at least.

Some other information:

* I'm looking for a 2.7 solution, ideally.
* I'd prefer not to use a different JSON library entirely.
* As suggested, I *am* willing to wait for a newline before doing any processing.
* However, I don't want to require exactly one object per line (I want to allow both multiple objects on one line and newlines within an object).

Evan
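For what it's worth, here is a rough sketch of that read-and-retry loop as a generator, assuming the input arrives on any file-like object (the function name is mine; raw_decode raises ValueError on incomplete input on both 2.7 and 3.x, so the same code should run on either):

```python
import json

def iter_json_objects(stream):
    """Yield JSON objects from a stream holding a concatenated
    sequence of objects (several per line, or spanning lines)."""
    decoder = json.JSONDecoder()
    buf = ''
    for line in stream:
        buf += line
        # Pull as many complete objects as possible out of the buffer.
        while True:
            stripped = buf.lstrip()
            if not stripped:
                buf = ''
                break
            try:
                obj, end = decoder.raw_decode(stripped)
            except ValueError:
                # Object incomplete so far -- wait for more input.
                buf = stripped
                break
            yield obj
            buf = stripped[end:]
```

This is just the strategy from steps 1-4 above: it only decodes after each newline, but it handles both multiple objects on one line and objects that span lines. Any trailing partial object left in the buffer at EOF is silently dropped in this sketch.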
-- http://mail.python.org/mailman/listinfo/python-list