On Thu, Dec 17, 2015 at 5:33 PM, Mike Hamburg <m...@shiftleft.org> wrote:

>
>
> On Dec 17, 2015, at 12:11 PM, Eric Rescorla <e...@rtfm.com> wrote:
>
>
>
> On Thu, Dec 17, 2015 at 3:02 PM, Hugo Krawczyk <h...@ee.technion.ac.il>
> wrote:
>
>> I have mentioned this in private conversations but let me say this here:
>> I would prefer that the nonces be explicitly concatenated to the handshake
>> hash.  That is,
>>
>> handshake_hash = Hash(client random            ||
>>                       server random            ||
>>                       Hash(handshake_messages) ||
>>                       Hash(configuration))
>>
>>
>> The reason is that nonces are essential for freshness and session
>> uniqueness and I want to see them explicitly included in the
>> signed/mac-ed/KDF-ed information. I can envision a future variant/mode of
>> the protocol where parties do not transmit nonces but have a synchronized
>> state that they advance separately and use as nonces (e.g., for key
>> refreshing) - in such case the nonces would not be included in the
>> handshake-hash computation.
>>
>> So while the redundancy of having them twice in the handshake_hash
>> calculation may be annoying, this adds robustness to the security (and
>> analysis) of the protocol.
>>
>
> This change doesn't make implementation or specification significantly
> more difficult.
> Does anyone else object or feel it makes analysis harder? :)
>
> -Ekr
>
>
> While I haven’t been following TLS 1.3 development all that closely, I
> will question this request.
>
> TLS is annoying to implement and analyze in part because it hashes
> more-or-less arbitrary parts of the handshake messages together, in
> arbitrary order, at arbitrary times.  Removal of all the explicit hashing
> of client/server random in TLS 1.3 makes it clearer what’s going on, and
> makes implementations simpler.
>

How does removal of explicit hashing of the nonces make things clearer?
What are the things that are made clearer?


>  Some of the crypto operations still feel pretty arbitrary (particularly
> Finished), but things seem to be improving overall.  In this context, it
> feels like adding client random and server random back to the hash is a
> regression.
>

What do you mean by "adding client random and server random back to the
hash is a regression"?
Why "back"? Were they removed? What's the regression?
You are probably not suggesting to omit them, right? Are you worried about
the redundancy of being hashed twice? Is it a security issue or an
implementation issue?

>
> From an analysis point of view, the client and server random are parseable
> from Hash(handshake messages) because they are concatenated with framing
> information.
>

They are parseable, but I am not sure they are *uniquely* parseable - a
fixed location in the stream does make them uniquely parseable.
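To illustrate the distinction: a raw concatenation of variable-length
values is ambiguous without framing, whereas fixed-length fields at a
fixed offset parse uniquely. A minimal sketch (hypothetical byte values;
32-byte nonces assumed for illustration):

```python
# Two distinct (a, b) pairs whose raw concatenation is identical,
# so Hash(a || b) alone cannot uniquely determine a and b.
a1, b1 = b"\x01\x02", b"\x03"
a2, b2 = b"\x01", b"\x02\x03"
assert a1 + b1 == a2 + b2  # same byte string, two different parses

# With fixed-length fields at fixed offsets, the parse is unique:
# bytes [0:32] are always the client random, [32:64] the server random.
blob = bytes(32) + bytes(range(32))
client_random, server_random = blob[:32], blob[32:64]
assert len(client_random) == len(server_random) == 32
```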



>  But here, they are concatenated without framing information.
>

The nonces are the main framing information - they are the (honest
parties') unique identifier of the handshake.
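For concreteness, the proposed computation could be sketched as below.
This is only an illustration, not spec text: SHA-256 and 32-byte nonces
are my assumptions here, and the field sizes are hypothetical.

```python
import hashlib

def handshake_hash(client_random: bytes, server_random: bytes,
                   handshake_messages: bytes, configuration: bytes) -> bytes:
    """Sketch of the proposed handshake_hash: the nonces sit at fixed
    offsets ahead of the two inner digests, so they are uniquely
    recoverable from the hashed input."""
    assert len(client_random) == len(server_random) == 32
    h = hashlib.sha256()
    h.update(client_random)
    h.update(server_random)
    h.update(hashlib.sha256(handshake_messages).digest())
    h.update(hashlib.sha256(configuration).digest())
    return h.digest()
```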



> So I don’t understand Hugo’s contention that the old scheme leads to
> trouble if the nonce changes sizes in a later version, and that the new
> scheme does not.  It seems to me that the reverse is more likely to be true.
>

I'm clearly not following your argument.

Hugo

>
> Cheers,
> — Mike
>
_______________________________________________
TLS mailing list
TLS@ietf.org
https://www.ietf.org/mailman/listinfo/tls