On Tue, Dec 12, 2017 at 12:06 AM, Jeff Goldberg <j...@agilebits.com> wrote:
>
> In encoding/base32 there is, I believe, an off-by-one error in the
> calculation of the size of the buffer needed, as returned by DecodedLen(),
> when padding is turned off.
>
> // DecodedLen returns the maximum length in bytes of the decoded data
> // corresponding to n bytes of base32-encoded data.
> func (enc *Encoding) DecodedLen(n int) int {
>         if enc.padChar == NoPadding {
>                 return n * 5 / 8
>         }
>
>         return n / 8 * 5
> }
>
>
> Note that when n is 1, DecodedLen() returns zero. Likewise, when n is 2,
> we get a DecodedLen() of 1.
>
> This leads to incorrect decoding, as the size of dbuf is wrong in
> DecodeString().
>
> If needed, I can construct tests showing this, but what I have at the moment
> (how I stumbled across this) isn't going to
> be very useful.

DecodedLen is supposed to be applied to the length of the encoded
data.  RFC 4648 says that the encoded data must be padded to be a
multiple of 8 bytes.  So when using padding, values of 1 or 2 should
never happen.  When not using padding, a minimum of two encoded bytes
is needed to represent a single decoded byte.  So a value of 1 should
never happen.
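
For concreteness, here is a minimal sketch of that arithmetic (not part
of the original message; it assumes Go 1.9's WithPadding to build the
unpadded encoding):

package main

import (
	"encoding/base32"
	"fmt"
)

func main() {
	padded := base32.StdEncoding
	unpadded := base32.StdEncoding.WithPadding(base32.NoPadding)

	// With padding, valid encoded input is always a multiple of 8
	// bytes, and every 8 encoded bytes hold at most 5 decoded bytes.
	fmt.Println(padded.DecodedLen(8))  // 5
	fmt.Println(padded.DecodedLen(16)) // 10

	// Without padding, 2 encoded bytes (10 bits) are the minimum
	// that can carry 1 decoded byte, so n == 1 never occurs in
	// valid input.
	fmt.Println(unpadded.DecodedLen(2)) // 1
	fmt.Println(unpadded.DecodedLen(1)) // 0 (impossible input)
}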

Perhaps DecodedLen should have had an error return, but it's too late
now.  There is no correct value to return for an impossible input,
so returning 0 seems as good as anything.
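
If a caller wants to reject impossible lengths up front, one option
(my own sketch, not anything in the standard library) is to validate
the length before decoding: since k decoded bytes encode to
ceil(8k/5) characters, a valid unpadded base32 length n always has
n mod 8 in {0, 2, 4, 5, 7}:

// validRawBase32Len reports whether n could be the length of valid
// unpadded base32 input. Since k decoded bytes encode to ceil(8k/5)
// characters, n mod 8 can only be 0, 2, 4, 5, or 7.
func validRawBase32Len(n int) bool {
	switch n % 8 {
	case 1, 3, 6:
		return false
	}
	return true
}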

Ian
