Dear David,

I have to start by apologizing. It's not until now, on reading your
email, that I've come to realize what the issues were with at least
the early negotiation proposals in a way that makes them clearly
articulable, and it would have been more fruitful for me to have
realized this earlier. I've also not necessarily fully digested the
latest proposal.

If we had used certificate_authorities to negotiate, every client
would send a list of all the authorities it recognized, and servers
would intersect that list with what they had, sending back the result,
or an error if there were none, perhaps with some indication of what
they had on hand. In this world a new client would be just as able to
send such a list as any other, and servers could recognize shifts in
support and change the set of CAs they use accordingly. There would be
no mechanism-inflicted penalty: if server A and client B can connect,
they are able to determine this fact, regardless of whatever
deviations from commonality each has in its contribution.
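To make the property concrete, here's a purely illustrative sketch of
that intersection scheme (not a wire format; the names are
hypothetical):

```python
# Hypothetical intersection-based negotiation: the client advertises the
# CAs it trusts, and the server intersects that with the CAs it can
# present. Any client, including a brand-new one, participates the same
# way; there is no penalty for sending a novel list.

def negotiate_ca(client_cas: set[str], server_cas: set[str]) -> set[str]:
    """Return the CAs both sides support, or raise if there is no overlap."""
    common = client_cas & server_cas
    if not common:
        # A real server might include indicative information about what
        # it has on hand.
        raise ValueError(f"no common CA; server has: {sorted(server_cas)}")
    return common

# A new client sends its (possibly novel) list like any other client:
print(sorted(negotiate_ca({"NewRoot", "OldRoot"}, {"LegacyRoot", "OldRoot"})))
# → ['OldRoot']
```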

A negotiation where what is advertised is an inherently opaque
pointer, and where each side must maintain an idea of what that
pointer could mean, does not have this property. Servers have to act
explicitly to support a new identifier by getting a configuration that
includes it. Whether or not this is handled indirectly as part of ACME
doesn't really change the equation. New clients can't count on server
support unless they advertise an already existing value. Various ways
to express deltas have been proposed to try to solve this, but they
all involve paying a penalty for deviation.
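For contrast, a sketch of the opaque-identifier style (again
hypothetical identifiers, not a real TLS mechanism): the server only
understands identifiers it was explicitly configured with, so a client
advertising only a new identifier fails.

```python
# Hypothetical opaque-identifier negotiation: each identifier is a
# pointer whose meaning both sides must already know. The server's
# explicit configuration is the only thing that makes an identifier
# usable, so a new client must also advertise an existing value to win.

SERVER_KNOWN = {"store-v1", "store-v2"}  # explicit server configuration

def negotiate_opaque(client_ids: list[str]) -> str:
    """Return the first client identifier the server recognizes."""
    for ident in client_ids:
        if ident in SERVER_KNOWN:
            return ident
    raise ValueError("no mutually understood identifier")

# A client advertising a new identifier succeeds only via the fallback:
print(negotiate_opaque(["store-v3", "store-v1"]))
# → store-v1
```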

We've been through this before with the web platform, which moved
toward feature detection and away from painful User-Agent sniffing as
browsers kept telling lies on top of lies.

There's also the fact that we've gone back and forth about how to
negotiate parameters, regrouping the possible dimensions a few times
and in different directions. We also have to deal with OS stores; this
bit us with TLS 1.2, where the set of supported algorithms had to be
split out for leaf certificates versus the rest of the chain, since
different implementations could be used. I'm not sure that starting
off with a tight coupling of a bunch of dimensions is the right idea,
but I don't feel as strongly about that.

The dynamic I'm worried about most, then, isn't fracturing: as you
point out, there are some countervailing forces where people want easy
support. Rather, it's that we artificially drive up the price of
picking CAs different from those the dominant implementation supports.

Sincerely,
Watson

_______________________________________________
TLS mailing list -- tls@ietf.org
To unsubscribe send an email to tls-le...@ietf.org
