Btw, setBinaryStream() should really throw an SQLException if
it cannot read as many bytes as expected from the InputStream.
Otherwise the application might silently lose data.
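
For illustration, a small self-contained example of how that silent loss
can happen. The ChunkedInputStream class below is only a stand-in I made
up for a slow network stream, not anything from the driver; it shows that
a single read() call may legally return fewer bytes than requested:

    import java.io.ByteArrayInputStream;
    import java.io.FilterInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    // Made-up stand-in for a slow network stream: never returns more
    // than 100 bytes per read() call, which the InputStream contract allows.
    class ChunkedInputStream extends FilterInputStream {
        ChunkedInputStream(InputStream in) {
            super(in);
        }

        public int read(byte[] b, int off, int len) throws IOException {
            return super.read(b, off, Math.min(len, 100));
        }
    }

    public class ShortReadDemo {
        public static void main(String[] args) throws IOException {
            InputStream in =
                new ChunkedInputStream(new ByteArrayInputStream(new byte[8192]));
            byte[] buf = new byte[8192];
            int n = in.read(buf, 0, buf.length);
            // Prints "requested 8192, got 100": the rest of the data is
            // still in the stream, but a single read() does not deliver it.
            System.out.println("requested " + buf.length + ", got " + n);
        }
    }
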
Regards
Martin
--
Martin Holz <[EMAIL PROTECTED]>
Softwareentwicklung / Vernetztes Studium - Chem
On Friday 09 January 2004 09:46, Kris Jurka wrote:
> On Wed, 7 Jan 2004, Martin Holz wrote:
> > Hello,
> >
> > org.postgresql.jdbc1.AbstractJdbc1Statement.setBinaryStream()
> > in postgresql 7.4.1 wrongly assumes, that
> > java.io.InputStream.read(byte[] b,int offs
There are two options:
a) Throw an exception.
b) Silently send all the bytes that are available.
I don't really understand what the length argument is good for
and think this is a design flaw on Sun's part. I would prefer
a), but I am not sure.
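
A rough sketch of what a) could look like (the readExactly() name and
the error message are my own invention, not the driver's actual code):

    import java.io.IOException;
    import java.io.InputStream;
    import java.sql.SQLException;

    public class ReadFully {
        // Reads exactly 'length' bytes from the stream or fails loudly.
        static byte[] readExactly(InputStream in, int length) throws SQLException {
            byte[] buf = new byte[length];
            int filled = 0;
            try {
                while (filled < length) {
                    int n = in.read(buf, filled, length - filled);
                    if (n < 0) {
                        // Stream ended early: option a), fail instead of
                        // silently sending a truncated value.
                        throw new SQLException("Expected " + length
                            + " bytes from InputStream, only got " + filled);
                    }
                    filled += n;
                }
            } catch (IOException ioe) {
                throw new SQLException("Could not read parameter data: " + ioe);
            }
            return buf;
        }
    }
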
Martin
--
M