Hi all,

There's been a lot said about root store divergence and fragmentation. We
discussed this quite a bit in the interim, but with the continued interest
in the topic, and some arguments being brought up on repeat, I wanted to
clear up some misconceptions, in a separate thread to avoid cluttering the
main one.

First, to correct a misrepresentation: this draft is not a veiled attempt
to completely diverge from the Web PKI and fragment the ecosystem. Section
7.6 is not "guarded language" to this effect. That simply does not make
sense on even a technical level. TAI is not needed to enable this.
Negotiating a few private CAs is easy. The existing certificate_authorities
extension works just fine to enable the hypothesized harmful fragmentation,
yet this has not manifested. Rather, this draft is trying to meet the needs
of existing public PKIs, with their large, independently maintained, but
broadly overlapping lists of multi-vendor CAs.
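
For concreteness, this is the existing structure, essentially as defined
in RFC 8446, Section 4.2.4:

    opaque DistinguishedName<1..2^16-1>;

    struct {
        DistinguishedName authorities<3..2^16-1>;
    } CertificateAuthoritiesExtension;

A client that trusts only a small set of private CAs can already send
this in its ClientHello, and a server can select a matching chain.
Nothing in TAI is needed for that case.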

As for Section 7.6, I wrote that text. It was describing how non-browser
clients can have different needs from browsers, as folks in this WG
frequently bring up the importance of non-browser-centricity for TLS. It
doesn't name specific players as examples simply because I think doing
that in an Internet-Draft would be disrespectful.

Having spent my whole career on user security in this space, I've found
that when things go wrong, it's common to hear: no, fixing this will
cause some other distant bit of infrastructure to fail! If users on the
web and this critical
infrastructure truly have such different needs, our systems should have the
tools to reflect that, instead of pitting them against each other.
Indeed, several root programs are already moving to purpose-specific PKI
hierarchies. PKIs translate an application's security and deployment needs
into policy and trust anchor choices. When the underlying needs are truly
different, it will naturally be reflected in PKI requirements.

And then there's the unavoidable time-based divergence. Clients last
updated a year ago will only reflect what happened then. PKIs make
subjective predictions about external entities ("I think this CA will only
sign correct things"). These predictions can and do go wrong. When they do,
clients must change, and we need a transition strategy. Old clients will
drop off as they are upgraded, but across the whole Internet, that takes
time. We
cannot just declare them non-existent, or demand that other services stop
supporting them. And we've heard from server operators in the interim that
the existing tools, including cross-signs, aren't sufficient for their
needs. The interim discussed these dynamics in more detail, and the
consequences of continuing to pit security and availability against each
other, rather than letting them work together.

So those are instances of small, yet impactful divergences that exist
today. I think what people are concerned about is broad, unnecessary and
rampant divergence. But two clients of comparable type and age (e.g., two
up-to-date browsers) diverging for the sake of differentiation is not in
anyone's interest. All the incentives here remain. Clients do not want it
to be difficult to serve them. That shared goal is itself the reason for
this work. Giving server operators the tools they need to weather some
divergence will not suddenly make it viable to maintain hundreds of
certificates for every client and every locality. That means there remain
reasons to align. And, to reiterate, a client wishing to use a private PKI
already has the tools to negotiate it in TLS.

There's also been talk of moving the ecosystem forward together, but
allowing faster-moving clients to clear security issues is what enables
risk-averse clients to follow. With or without TAI, if client A disables
TLS 1.0 or removes a bad CA, no amount of fate sharing will make client B
more secure. What makes client B more secure is for it to mirror the change.
The sooner that change sticks in A, the more the rest of the ecosystem will
be compatible with it, which makes it easier for client B to match. That
lets the transition complete more quickly and reduces this time-based
divergence. Tying the faster clients to the slower ones doesn't bring the
slower ones forward. It slows the whole ecosystem down.

All this is a dynamic we've explored throughout TLS. It's the story of all
parameter negotiation. We need the lever for transitions and for the cases
where needs truly differ, but we still face pressures to keep the
variations in check. We've never treated PKIs as different here. We had a
CA list in CertificateRequest from SSL 3.0 through TLS 1.2. We had
trusted_ca_keys in the ClientHello in RFC 4366, renewed in RFC 6066. We
generalized the CA
list to certificate_authorities in RFC 8446. We had a whole interim on the
topic and reaffirmed this.

As we all established at the interim, it does not make sense to sacrifice
both security and availability out of a fear that restricting trust
anchor negotiation to closed ecosystems is somehow all that stands
between open ecosystems and their becoming closed.

David