In encoding/base32 there is, I believe, an off-by-one error in
DecodedLen(): it miscalculates the size of the buffer needed for the
decoded data when padding is turned off.

// DecodedLen returns the maximum length in bytes of the decoded data
// corresponding to n bytes of base32-encoded data.
func (enc *Encoding) DecodedLen(n int) int {
	if enc.padChar == NoPadding {
		return n * 5 / 8
	}
	return n / 8 * 5
}

Note that when n is 1, DecodedLen() returns zero. Likewise, when n is 2,
DecodedLen() returns 1.

This leads to incorrect decoding, as the size of dbuf is wrong in 
DecodeString().

If needed, I can construct tests showing this, but what I have at the
moment (the code in which I stumbled across this) isn't going to be very
useful.

-- 
You received this message because you are subscribed to the Google Groups 
"golang-nuts" group.