It's not quite as simple as reading the first n% of the bit-stream: the image needs to be encoded using tiles so that a tile-aware decoder can read only the portions it needs. This approach is very popular in the library community because it lets a site like http://chroniclingamerica.loc.gov/ serve tiles to a deep-zoom viewer without having to decode a full 600 DPI scan each time. It's in common use, but historically not with open-source software: the venerable libjasper doesn't support it (and is excruciatingly slow), while the newer OpenJPEG has added support, so it's now possible without relying on a licensed codec.
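To make the tile-aware idea concrete, here's a minimal sketch of how a server could index the tile-parts in a raw JPEG 2000 codestream so it can answer byte-range requests per tile. It relies only on the SOT (start-of-tile-part) marker segment layout from ISO/IEC 15444-1; the function name and the indexing strategy are my own illustration, not anything from a particular library:

```python
import struct

SOT = b"\xff\x90"  # start-of-tile-part marker (ISO/IEC 15444-1)

def tile_part_index(codestream: bytes):
    """Return a list of (tile_index, byte_offset, byte_length) for each
    tile-part in a raw JPEG 2000 codestream.

    SOT segment layout: marker(2) Lsot(2) Isot(2) Psot(4) TPsot(1) TNsot(1).
    Psot is the tile-part length measured from the start of the SOT marker,
    so we can jump from tile-part to tile-part without decoding entropy data
    (which also avoids false hits on 0xFF90 inside compressed data).
    """
    parts = []
    pos = codestream.find(SOT)
    while pos != -1:
        _lsot, isot, psot = struct.unpack_from(">HHI", codestream, pos + 2)
        if psot == 0:
            # Psot == 0 means "extends to the end of the codestream"
            psot = len(codestream) - pos
        parts.append((isot, pos, psot))
        pos = codestream.find(SOT, pos + psot)
    return parts
```

A tile server built this way only needs to read the header region once; after that, serving one tile of a huge scan is a single seek-and-read (or an HTTP Range request) rather than a full decode.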
As far as transfer efficiency goes, the tile wrappers add slight overhead, but nowhere near enough to cancel out the compression win of JP2 over JPEG. For those of us running servers, it's also frequently a win for cache efficiency versus serving separate images, particularly if a CDN miss means going back to the origin and your stack can stream the cached initial portion of the image while issuing byte-range requests for the rest.

Chris
_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform