On Mon, May 23, 2011 at 08:50:55AM -0500, Anthony Liguori wrote:
> >>The actual value of the alert will surprise you :-)
> >>
> >>Integers in Javascript are actually represented as doubles
> >>internally which means that integer constants are only accurate up
> >>to 52 bits.
> >>
> >>So really, we should cap integers at 32-bit :-/
> >>
> >>Have I mentioned recently that I really dislike JSON...
> >
> >NB, I am distinguishing between JSON the generic specification and
> >JSON as implemented in web browsers. JSON the specification has *no*
> >limitation on integers.
>
> The spec has no notion of integers at all. Here's the relevant
> text. Note that the BNF only has a single entry point for numbers.
> It does not distinguish between integers and floating point numbers.
> Also, the only discussion of valid numbers is about whether the
> number can be represented as a rational number. I think the only
> way to read the spec here is that *all* numbers are meant to be
> represented as floating point numbers.
I don't agree. It means that JSON as a format can represent arbitrary
numbers, whether rational or not. Interpretation as 32/64/128-bit
floating point, or as 16/32/64-bit integers, is entirely a matter for
the parser. The only issue is whether the parser implementation can
hold the numbers without losing precision. e.g. if JavaScript gets a
number which doesn't fit in 52 bits, it'll lose some precision due to
floating point storage. Nothing in the JSON spec requires other
languages to throw away precision when parsing or formatting JSON.

Daniel

--
|: http://berrange.com  -o-  http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org   -o-  http://virt-manager.org                 :|
|: http://autobuild.org -o-  http://search.cpan.org/~danberr/        :|
|: http://entangle-photo.org -o- http://live.gnome.org/gtk-vnc       :|
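P.S. The point above can be demonstrated concretely. A short Python sketch
(Python's stdlib json module parses JSON integers as arbitrary-precision
ints, so nothing is lost; forcing the same text through a double shows what
a JavaScript-style parser would do):

```python
import json

# 2**53 + 1 cannot be represented exactly as an IEEE-754 double,
# which is the only number type JavaScript has.
big = 2**53 + 1                 # 9007199254740993
text = json.dumps(big)          # '9007199254740993' - the JSON text is exact

# A parser that keeps integers as integers round-trips the value exactly:
assert json.loads(text) == big

# A parser that stores every number as a double rounds it to the
# nearest representable value (2**53 here):
assert float(text) == float(2**53)
```

So the precision loss is a property of a particular implementation's number
representation, not of the JSON text itself.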