I've been looking at the code, and I noticed that (for some reason) the code for streaming a JSONObject invokes PrintWriter.flush() at the end, while the code for streaming a partial page render response (which is also JSON) invokes PrintWriter.close(). I believe the latter is correct. It is possible that flush() does not truly flush, so the end of the gzip stream (the trailer) never gets sent to the client, causing problems.
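In isolation the difference is easy to demonstrate. Here is a minimal, self-contained sketch (the class name is mine; this is not the actual Tapestry response code): flushing a PrintWriter layered over a GZIPOutputStream pushes the writer's buffer down, but it does not finish the deflater or write the gzip trailer, so what has reached the client at that point is an incomplete gzip stream.

import java.io.*;
import java.util.zip.*;

public class GzipFlushVsClose {

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        GZIPOutputStream gzip = new GZIPOutputStream(buffer);
        PrintWriter writer = new PrintWriter(new OutputStreamWriter(gzip, "UTF-8"));

        writer.print("{\"result\":\"ok\"}");

        // flush() hands the characters down to the GZIPOutputStream, but the
        // deflater keeps them buffered and the gzip trailer (CRC + length) is
        // never written.
        writer.flush();

        byte[] sent = buffer.toByteArray();

        // Decompressing what was "sent" at this point fails part-way through
        // with "Unexpected end of ZLIB input stream".
        InputStream in = new GZIPInputStream(new ByteArrayInputStream(sent));
        try {
            while (in.read() != -1) { /* drain */ }
            System.out.println("decompressed cleanly");
        } catch (EOFException e) {
            System.out.println("truncated gzip stream: " + e.getMessage());
        }

        // close() closes the GZIPOutputStream, which finishes the deflater and
        // writes the trailer; only then does buffer hold a complete gzip stream.
        writer.close();
    }
}

That only shows the behavior in isolation, of course.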
I'm trying to figure out how to test this hypothesis, but I'm not sure how to get the current code to fail under test conditions.

On Fri, Sep 17, 2010 at 3:59 AM, Michael Dukaczewski <m.dukaczew...@tu-bs.de> wrote:
> Hi Howard and all,
>
> thanks for your reply. I am always impressed by how flexible Tapestry is.
> Activating compression for JSON worked, but I was bitten by TAP5-469 and
> started some research.
> Regarding your blog post
> (http://tapestryjava.blogspot.com/2009/04/is-gzip-compression-compatible-with.html):
> I tried several other combinations that worked fine, so I'm convinced
> that GZIP compression in combination with JSON content basically works.
> Thus, the error seems to be somewhere in the Tapestry code. My research
> has shown that the Tapestry gzip stream sometimes does not finish
> correctly when sending JSON, so the data is broken when it arrives at
> the client. I have now solved the problem in the following way and it
> works very well:
>
> I changed my event handler from:
>
>     JSONObject onAction() { return getJson(); }
>
> to:
>
>     StreamResponse onAction() { return new JSONResponse(getJson()); }
>
> and created the following wrapper:
>
> public class JSONResponse implements StreamResponse {
>
>     private static final int HEADER_SIZE = 20;
>     private static final int MIN_DATA_SIZE = 512;
>     private static final String CHARSET = "UTF-8";
>
>     private byte[] data;
>     private boolean compress;
>
>     public JSONResponse(JSONCollection json) {
>         try {
>             data = json.toCompactString().getBytes(CHARSET);
>             compress = data.length >= MIN_DATA_SIZE;
>         } catch (UnsupportedEncodingException e) {
>             // should never happen!
>         }
>     }
>
>     @Override
>     public String getContentType() {
>         return "application/json; charset=" + CHARSET;
>     }
>
>     @Override
>     public InputStream getStream() throws IOException {
>         if (!compress) {
>             return new ByteArrayInputStream(data);
>         }
>         ByteArrayOutputStream out = new ByteArrayOutputStream(
>                 expectedCompressedSize(data.length));
>         GZIPOutputStream gzip = new GZIPOutputStream(out);
>         gzip.write(data);
>         gzip.close();
>         byte[] gzippedData = out.toByteArray();
>         return new ByteArrayInputStream(gzippedData);
>     }
>
>     @Override
>     public void prepareResponse(Response response) {
>         if (compress) {
>             response.setHeader("Content-Encoding", "gzip");
>         }
>     }
>
>     private int expectedCompressedSize(int size) {
>         return (size >> 2) + HEADER_SIZE;
>     }
> }
>
> Regards,
> Michael
>
>
> On 16.09.2010 18:03, Howard Lewis Ship wrote:
>> Also, make sure you disable JSON pretty printing! Most of a JSON
>> response is now whitespace when in development mode.
>>
>> On Thu, Sep 16, 2010 at 8:59 AM, Howard Lewis Ship <hls...@gmail.com> wrote:
>>> It's a bit kludgey, but you could decorate the
>>> ResponseCompressionAnalyzer service, something like:
>>>
>>> public ResponseCompressionAnalyzer decorateResponseCompressionAnalyzer(
>>>         final ResponseCompressionAnalyzer delegate)
>>> {
>>>     return new ResponseCompressionAnalyzer() {
>>>         public boolean isGzipSupported() {
>>>             return delegate.isGzipSupported();
>>>         }
>>>
>>>         public boolean isCompressable(String contentType) {
>>>             if (contentType.equals("application/json")) return true;
>>>
>>>             return delegate.isCompressable(contentType);
>>>         }
>>>     };
>>> }
>>>
>>> On Thu, Sep 16, 2010 at 7:50 AM, Michael Dukaczewski
>>> <m.dukaczew...@tu-bs.de> wrote:
>>>> I know. I have been following the topic. But now I have the problem
>>>> that I have to transfer very large JSON objects.
>>>> The application on which I am working is just for a small group of
>>>> people (an intranet) where I can control which browsers are used. With
>>>> luck, I can find a configuration that works well in my case with gzip
>>>> compression. So is there a way to reactivate it?
>>>>
>>>>
>>>> On 16.09.2010 16:12, Thiago H. de Paula Figueiredo wrote:
>>>>> On Thu, 16 Sep 2010 10:40:08 -0300, Michael Dukaczewski
>>>>> <m.dukaczew...@tu-bs.de> wrote:
>>>>>
>>>>>> thanks for your answer, but that does not help me.
>>>>>> Is there a simple workaround to reactivate gzip compression for JSON?
>>>>>
>>>>> It was disabled because it caused problems in some browsers.

--
Howard M. Lewis Ship

Creator of Apache Tapestry

The source for Tapestry training, mentoring and support. Contact me to
learn how I can get you up and productive in Tapestry fast!

(971) 678-5210
http://howardlewisship.com
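For reference, here is a minimal, compilable version of the decorator idea quoted above, as it might sit in an application's Tapestry module class. The AppModule class name and package are placeholders, and the method names on ResponseCompressionAnalyzer follow the snippet in this thread; check them against the interface in the Tapestry version you are running before relying on this sketch.

package com.example.services;

import org.apache.tapestry5.services.ResponseCompressionAnalyzer;

public class AppModule
{
    // Tapestry IoC matches this method by its "decorate" prefix plus the
    // service name, and passes in the original implementation as the delegate.
    public static ResponseCompressionAnalyzer decorateResponseCompressionAnalyzer(
            final ResponseCompressionAnalyzer delegate)
    {
        return new ResponseCompressionAnalyzer()
        {
            public boolean isGzipSupported()
            {
                return delegate.isGzipSupported();
            }

            public boolean isCompressable(String contentType)
            {
                // Treat JSON as compressable again; defer everything else to
                // the default analyzer.
                if ("application/json".equals(contentType))
                    return true;

                return delegate.isCompressable(contentType);
            }
        };
    }
}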