Jesse B. Crawford:
> [Explanation of EV certificates]
>
> Now, two HUGE caveats:
>
> 1) Facebook does not actually have an EV cert for their hidden
> service! They have an OV cert with O=Facebook, Inc., but for various
> (largely political but still largely valid) reasons Firefox does not
> trust the O field and considers OV certificates no better than DV.
Facebook doesn't have an EV certificate at all! The same cert is present on both their clear web and hidden service sites. And, IMHO, I don't think that this is the list for solving the CA problem. If we can get a visibly encrypted page displayed to a user without a warning, then that's probably good enough.

> So why does Facebook use SSL..? I don't know, perhaps they think
> that providing the O field is significant (I'd say that it isn't
> because browsers don't tell the user about it), or perhaps it's just
> for consistency with their open web presence.

https://www.facebook.com/notes/protect-the-graph/making-connections-to-facebook-more-secure/1526085754298237

"We decided to use SSL atop this service due in part to architectural considerations - for example, we use the Tor daemon as a reverse proxy into a load balancer and Facebook traffic requires the protection of SSL over that link. As a result, we have provided an SSL certificate which cites our onion address; this mechanism removes the Tor Browser's "SSL Certificate Warning" for that onion address and increases confidence that this service really is run by Facebook. Issuing an SSL certificate for a Tor implementation is - in the Tor world - a novel solution to attribute ownership of an onion address; other solutions for attribution are ripe for consideration, but we believe that this one provides an appropriate starting point for such discussion."

> 2) Don't think that I'm an advocate of the present CA infrastructure,
> it's a terrible approach to the problem. But it is the approach that
> we have right now. :)
>
> Overall, what should be done?
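For anyone unfamiliar with the arrangement Facebook describes, the hidden service side of it is just standard torrc configuration: the Tor daemon publishes the onion address and forwards virtual port 443 to a backend (their load balancer), over which they then still require SSL. A hypothetical sketch (the directory path and backend address here are made up for illustration):

```
# torrc fragment - hypothetical values
HiddenServiceDir /var/lib/tor/onion_service/
# Virtual port 443 on the .onion address forwards to the
# load balancer; SSL protects this internal hop as well.
HiddenServicePort 443 192.0.2.10:443
```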
> Layering SSL on top of the hidden
> service system is not a good solution to the problem, but I'm also not
> comfortable with just saying "users should be smart enough to validate
> that they have the right address" and relying on the difficulty in
> producing a near-collision address (keep in mind that many "important"
> hidden services do not have a vanity address at all or have only
> generated an address with a small number of chosen characters).

I certainly agree that the CA system is not ideal and that we must assume that users know nothing about security (and rightly so), but we've struggled to get the current system working somewhat reliably for end users for nigh on 20 years, and it will be difficult to suddenly change direction.

> Probably the best solution is that hidden services that are attractive
> for phishing/misdirection (just about anyone doing business in bitcoin
> for example) should implement measures like showing secrets to the
> user to prove the service identity, and users should of course
> beware. But this solution requires service operator and user
> participation, making it far less than ideal.

Where we can assume a technically literate user, alternate solutions should be possible, but I do wonder whether they are worth the effort. Even for something like Bitcoin, where we can currently assume that the user has some understanding of security, a different solution would have a major impact if/when the service becomes popular with less technical users. Any solution should just work. Chrome is moving towards this with certificate pinning: displaying "I'm sorry Dave, I'm afraid I can't let you do that" when the certificate doesn't match expectations.

An alternative may be to somehow tie a hidden service's private key to that of the certificate and then allow TBB or Tor to perform the validation. I'm not sure how it would work technically, but it seems like a logical step.
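Worth noting for that last idea: the onion address is already bound to the service's key. Per Tor's rend-spec, a (v2) onion address is the base32 encoding of the first 80 bits of the SHA-1 hash of the service's DER-encoded RSA public key, so "validating the certificate against the hidden service key" would amount to checking that whatever key material is presented derives back to the address the client dialled. A minimal Python sketch of just the derivation (not Tor's actual implementation; the input is whatever DER blob the service publishes):

```python
import base64
import hashlib

def onion_address(der_pubkey: bytes) -> str:
    """Derive a v2 .onion address from a DER-encoded RSA public key.

    Per rend-spec (v2): the address is the base32 encoding of the
    first 80 bits (10 bytes) of SHA-1 over the DER public key.
    """
    digest = hashlib.sha1(der_pubkey).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower()

# Dummy blob standing in for a real DER key, just to show the shape:
print(onion_address(b"test") + ".onion")
```

Any client that can see the service's public key can redo this check locally, which is what would let TBB or Tor perform the validation rather than the user eyeballing sixteen base32 characters.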
--
kat
--
tor-talk mailing list - [email protected]
To unsubscribe or change other settings go to
https://lists.torproject.org/cgi-bin/mailman/listinfo/tor-talk
