On Monday 18 July 2016 13:08:43 Hanno Böck wrote:

> * There don't seem to be any straightforward tools that test for
>   version intolerance. The SSL Labs test does detect version
>   intolerance, but it's limited to public facing https servers and it
>   doesn't seem to detect some of the weirder variations (as described
>   above). There's also a test in Hubert Kario's tlsfuzzer, but I've
>   been unable to get it to work. I tried to create a test myself, but
>   the results were highly erratic and I'm not sure why.

I hope the updated README helps with this (I just pushed the changes);
tell me if anything else is missing.

Feel free to send the problems you ran into with the test you wrote yourself
(but please remember that the note in the README applies - it's an alpha, so
there are no API stability guarantees).

> * We don't have good data on the issue. The latest numbers I could find
>   came from Ivan Ristic in 2013 [4], and from David Benjamin we know he
>   considers the problem to be large enough that version fallbacks are
>   inevitable. That's far from good data. We also don't seem to have any
>   public list of affected vendors, devices and webpages.

I'm running a test against the Alexa Top 1 million /right now/ using this
code:
https://github.com/jvehent/cipherscan/pull/109

In general it's a rough check, i.e. it checks only for client hello
version intolerance, irrespective of record layer version, ciphers or
extensions (it sends multiple probes modelled on different clients' hellos,
plus modified variants of them).
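
For illustration, here's a minimal stand-alone probe in the same spirit (just
a sketch, not the cipherscan code from the PR above): it sends a bare
ClientHello whose client_version field is raised above TLS 1.2 and reports
whether the server answers with a ServerHello, sends an alert, or simply
drops the connection. The cipher suite list and the 3.1 record-layer version
are arbitrary choices for the example.

import os
import socket
import struct
import sys

def client_hello(version):
    # a handful of common cipher suites, nothing clever
    ciphers = bytes.fromhex("c02fc030c013c014009c002f0035000a")
    body = (struct.pack(">H", version)       # client_version under test
            + os.urandom(32)                 # random
            + b"\x00"                        # empty session_id
            + struct.pack(">H", len(ciphers)) + ciphers
            + b"\x01\x00")                   # null compression only
    handshake = b"\x01" + struct.pack(">I", len(body))[1:] + body
    # record-layer version stays at 3.1 (TLS 1.0), like most real clients
    return b"\x16\x03\x01" + struct.pack(">H", len(handshake)) + handshake

def probe(host, port, version):
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(client_hello(version))
        try:
            reply = s.recv(5)
        except (ConnectionResetError, socket.timeout):
            return "connection reset / timeout"
    if not reply:
        return "connection closed"
    if reply[0] == 0x16:
        return "ServerHello (tolerant)"
    if reply[0] == 0x15:
        return "alert"
    return "unexpected record type %#x" % reply[0]

if __name__ == "__main__":
    host = sys.argv[1]
    for ver in (0x0303, 0x0304, 0x0305):  # TLS 1.2, 1.3 and one from the future
        print("client_version %#06x: %s" % (ver, probe(host, 443, ver)))

A tolerant server should answer all three probes with a ServerHello for the
highest version it actually supports; closing the connection or choking on
the unknown higher values is exactly the intolerance the scan is looking for.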

I've also added an Xmas client hello message[1] that aims to trigger as
many intolerances as possible, but it's mostly just to single out
some of the weirder servers.

E.g. some servers will just close the connection if you send them an SNI
name they don't expect.

 1 - similar to https://en.wikipedia.org/wiki/Christmas_tree_packet
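
The SNI case is easy to check with nothing more than the Python ssl module
(again just a sketch; "unexpected.invalid" is an arbitrary placeholder name):
connect once with the real host name and once with a name the server has no
certificate for, and see how far the handshake gets. Serving the default
certificate or sending an unrecognized_name alert is fine; dropping the TCP
connection is the kind of breakage I mean above.

import socket
import ssl
import sys

def try_handshake(host, sni):
    ctx = ssl.create_default_context()
    ctx.check_hostname = False        # we only care whether the handshake completes
    ctx.verify_mode = ssl.CERT_NONE
    try:
        with socket.create_connection((host, 443), timeout=5) as raw:
            with ctx.wrap_socket(raw, server_hostname=sni) as tls:
                return "handshake ok (%s)" % tls.version()
    except (ConnectionResetError, socket.timeout):
        return "connection reset / timeout"
    except ssl.SSLError as e:
        return "TLS alert / error: %s" % e.reason

if __name__ == "__main__":
    host = sys.argv[1]
    print("expected SNI:  ", try_handshake(host, host))
    print("unexpected SNI:", try_handshake(host, "unexpected.invalid"))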

> I want to try to work on some of those issues in the near future.
> Roughly I'd like to see that we work on a plan to reduce TLS brokenness
> in general and in particular - right now as this is an issue affecting
> the deployment of TLS 1.3 - Version intolerance.
> 
> Things that I think we could and should do:
> * Talk to developers of test tools (sslyze, testssl, openvas, but also
>   commercial tools etc.) that they include appropriate tests in their
>   tools and warn about these issues. Also it'd be great if e.g. things
>   like the google webmaster tools or any other public test tools for
>   webservers/websites could test for this.

+1; in particular, TLS 1.3 client hello version intolerance should be
reported as an outright failure by any TLS test.

> * Get some data from internet wide scans and make it public. Maybe have
>   a public shame list of the top X pages breaking TLS.

I'm working on it. I'll make the scan data public, but I'll leave the
shaming part to somebody else.

> * In general, more and better detailed documentation of this and
>   similar issues, also raise this as a potential research topic.

+1 again. I think we should do a more "negative" reading of the TLS standard
(as in, what a peer should do if the other side does not behave as
expected), both to specify which alerts should be sent and when, and to
specify when such "misbehavior" MUST NOT trigger any failure on the other side.

-- 
Regards,
Hubert Kario
Senior Quality Engineer, QE BaseOS Security team
Web: www.cz.redhat.com
Red Hat Czech s.r.o., Purkyňova 99/71, 612 45, Brno, Czech Republic

