Package: kaffe
Version: 2:1.1.5-3
Severity: important

*** Please type your report below this line ***

java.net.URLConnection and/or java.net.HttpURLConnection report the
compressed content length, taken from the HTTP headers, when the web
server sends a gzip-compressed file.

However, when an InputStream for the content is created with
getInputStream(), the read() and available() methods return the
uncompressed data, but only up to the length of the compressed payload.
The data is thus truncated.
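
For reference, a minimal sketch of the pattern that shows the truncation;
the URL is only a placeholder, any server that answers with
Content-Encoding: gzip will do:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class GzipTruncation {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: any server replying with Content-Encoding: gzip triggers it.
        URL url = new URL("http://example.org/index.html");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Length of the *compressed* body, taken from the Content-Length header.
        int reported = conn.getContentLength();

        // Count the bytes read() actually delivers.
        InputStream in = conn.getInputStream();
        byte[] buf = new byte[4096];
        int total = 0, n;
        while ((n = in.read(buf)) != -1) {
            total += n;
        }
        in.close();

        // With the affected kaffe, reading stops after 'reported' bytes of
        // uncompressed data, so the content is cut short.
        System.out.println("Content-Length: " + reported);
        System.out.println("Bytes read:     " + total);
    }
}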

A workaround is to prevent compression on the HTTP connection by forcing
the request headers (here conn is the HttpURLConnection):

conn.setRequestProperty("Accept", "text/html, text/plain");
conn.setRequestProperty("Accept-Encoding", "chunked;q=1.0");

But this seems to break something else with large files.
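
For completeness, a sketch of how the workaround is wired in (placeholder
URL; the variable name conn is just for illustration). The request
properties have to be set before getInputStream() opens the connection:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class GzipWorkaround {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.org/index.html");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Must be set before getInputStream(), which is what actually
        // opens the connection, so the server sends the body uncompressed.
        conn.setRequestProperty("Accept", "text/html, text/plain");
        conn.setRequestProperty("Accept-Encoding", "chunked;q=1.0");

        InputStream in = conn.getInputStream();
        // ... read as usual; getContentLength() should now match the bytes delivered.
        in.close();
    }
}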

-- System Information:
Debian Release: 3.1
  APT prefers testing
  APT policy: (500, 'testing')
Architecture: i386 (i686)
Kernel: Linux 2.6.11.8bravo
Locale: LANG=C, LC_CTYPE=C (charmap=ANSI_X3.4-1968)

Versions of packages kaffe depends on:
ii kaffe-pthreads 2:1.1.5-3 A POSIX threads enabled version of

-- no debconf information


