On Thu, Jan 28, 2010 at 7:10 PM, Eran Hammer-Lahav <e...@hueniverse.com> wrote:
> (For the sake of simplicity, I am going to refer to the Plain bearer token with SSL/TLS as S-Plain.)
>
> WRAP's appeal is also its limitation, and (strictly as written) it sacrifices security for client ease. We still don’t know what OAuth 2.0 is (it is **not** WRAP, but likely to include ideas from it).

How is security sacrificed in the S-Plain method proposed for OAuth 2.0? I understand that WRAP makes sacrifices, but am I missing something in terms of always moving a Plain bearer token over SSL/TLS when making authenticated requests?

> The problem I have with your approach is that you are mandating server deployment, and given that this is a security protocol, we have no business telling servers what they MUST implement (they might consider S-Plain too weak for their needs). There are many reasons why vendors would rather not use the S-Plain option (you listed some of those) for every protected resource request. There are also many reasons why vendors would only support S-Plain and no signature option.
>
> My proposal allows you to do everything you have asked for, and your developers will have an easy life with the S-Plain option, or performance / flexibility with one of the HMAC methods. But the advantage is that I do not mandate Facebook to do anything it doesn’t want. Any OAuth 2.0 client will work with Facebook.
>
> It is true that writing a fully compliant client will require more work. But the whole point of the S-Plain option is that it requires **no** additional library beyond having SSL/TLS available. So this is a vendor support issue – if you want to enable library-less client development, offer S-Plain. This, again, is a vendor choice.

Yes, but the tradeoff here is that a client will have to implement S-Plain plus two or three other algorithms. The library size is increased, the complexity is increased, and it encourages servers (like Facebook) to develop separate libraries which only support the one or two methods they have chosen to use. Luke and I would like to avoid creating Facebook-specific client libraries. (This is of course based on the assumption that the majority of implementations will adopt S-Plain. We then envision an advanced client library which also implements HMAC-SHA-256.)

> Also keep in mind that the current proposal moves the algorithm choice to the token request/issue phase. Each token will only work with a single algorithm (which is better cryptographic hygiene). So a vendor can choose to allow the client to pick the algorithm they want to use, or just tell them which one they are going to use.
>
> EHL
>
> *From:* Luke Shepard [mailto:lshep...@facebook.com]
> *Sent:* Thursday, January 28, 2010 6:36 PM
> *To:* Eran Hammer-Lahav; oauth@ietf.org
> *Subject:* Re: [OAUTH-WG] Discussion of SSL as the primary means for OAuth communication
>
> Thanks for the detailed reply, Eran.
>
> I think that your proposed design has it backwards: servers should bear complexity and one-time costs, not clients. Much of why I think OAuth WRAP / 2.0 is so appealing is that it dramatically simplifies life for developers. If a client can support SSL and doesn’t want to deal with signatures, then it never should have to. I want to be able to write a very lightweight SSL-only client library that is still fully OAuth 2.0 compatible.
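For illustration, here is a minimal sketch of roughly the entire client an SSL-only deployment would need. The endpoint, parameter name, and token are made up; nothing below is taken from the spec:

    // Plain bearer token over SSL/TLS: the whole "client" is one HTTPS GET.
    var https = require('https');

    var token = 'abc123'; // hypothetical access token obtained earlier

    https.get({
      host: 'api.example.com',                                    // hypothetical provider
      path: '/me?wrap_access_token=' + encodeURIComponent(token)  // hypothetical resource
    }, function (res) {
      var body = '';
      res.on('data', function (chunk) { body += chunk; });
      res.on('end', function () { console.log(res.statusCode, body); });
    });

No base string, no nonces, no signature code; the protection comes entirely from SSL/TLS.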
> I think instead, the consensus should be:
>
> - Support an extensible set of signature algorithms
> - Servers required to support plaintext tokens over a secure channel (SSL/TLS)
> - Servers required to support at least one or a small subset of encryption algorithms. Possible candidates include hmac-sha-1, hmac-sha-256, and rsassa....
> - Clients must support at least ONE of the required encryption methods
>
> If a client supports at least one of the required methods (SSL or one of the algorithms), then it can be considered fully OAuth 2.0 compliant. If it only supports encryption methods that are not in the required list, then it is partially compliant – that is, it may work with a specific vendor that advertises support for that algorithm, but it won’t work with all servers. As algorithms gradually change over time, client libraries can be upgraded while servers need to maintain backwards compatibility. (For example, Facebook will deprecate MD5 as its sig algorithm, but will continue to support it for the clients using it in the wild.)
>
> As a concrete example of what can be written when you don’t need to worry about encryption, see:
> http://github.com/lshepard/oauth-wrap-demo/tree/master/src/wrap.js
> If I were to modify that JS to accommodate multiple signature algorithms, then I would need to include at least another 1-2k of JavaScript for each method just in case. That would be fairly ridiculous.
>
> On 1/28/10 1:41 PM, "Eran Hammer-Lahav" <e...@hueniverse.com> wrote:
>
> Thanks Luke. This is a useful analysis.
>
> I believe we have already reached consensus in this regard:
>
> * Support an extensible set of signature algorithms
> * Define hmac-sha-1, hmac-sha-256, and rsassa-pkcs1-v1.5-sha-256
> * Define a plain method for bearer tokens and either require SSL/TLS or strongly recommend it (open issue)
>
> (If these assumptions are incorrect, raise your hand now.)
>
> The specification is going to define the four methods above (and possibly more) as **must implement**. This does not mean vendors have to support all of them, but that any **client** library or application claiming to be ‘OAuth 2.0 compliant’ must implement all these methods.
>
> If a vendor issues tokens not using all these methods (which is more likely), it does not have to implement all of them and can still be called ‘OAuth 2.0 compliant’ because it can interop with **any** compliant library. However, if such a vendor produces a client library for its developers which only implements the methods deployed by the vendor, that library is ‘OAuth 2.0 partially compliant’ (because it will not interop with **all** vendors).
>
> In practice, from Facebook’s perspective, you will be able to deploy OAuth 2.0 in a compliant way while only implementing the methods that you find suitable for your developers. It sounds like you will be using the plain method over SSL and either hmac-sha-1 or hmac-sha-256. Since Facebook usually makes things trivial for its developers, I expect Facebook to produce libraries that implement these two methods, but not others. This will be a ‘Facebook API’ library that uses OAuth 2.0. It will be partially compliant, but that only means it is guaranteed to work only with Facebook; it will do so in a fully compliant way.
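To make the size and complexity point concrete, below is a rough sketch of the kind of signing code a client needs for an hmac-sha-256 method. The base-string normalization here is invented purely for illustration, since the actual rules for the OAuth 2.0 HMAC methods have not been written:

    var crypto = require('crypto');

    // Invented normalization: sort the parameters, join them, and prepend the
    // method, host, and path. The real rules would come from the spec.
    function signRequest(method, host, path, params, tokenSecret) {
      var normalized = Object.keys(params).sort().map(function (k) {
        return k + '=' + encodeURIComponent(params[k]);
      }).join('&');
      var baseString = [method.toUpperCase(), host, path, normalized].join('\n');
      return crypto.createHmac('sha256', tokenSecret)
                   .update(baseString)
                   .digest('base64');
    }

    // Hypothetical usage; the resulting signature travels with the request.
    var sig = signRequest('GET', 'api.example.com', '/me', {
      wrap_access_token: 'abc123',
      timestamp: '1264742400',
      nonce: '7d8f3e4a'
    }, 's3cr3tTokenSecret');

Multiply that by two or three algorithms, plus key management and timestamp/nonce handling, and the gap between a signing client and the plain-over-SSL case is substantial.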
> EHL
>
> *From:* oauth-boun...@ietf.org [mailto:oauth-boun...@ietf.org] *On Behalf Of* Luke Shepard
> *Sent:* Thursday, January 28, 2010 7:30 AM
> *To:* oauth@ietf.org
> *Subject:* [OAUTH-WG] Discussion of SSL as the primary means for OAuth communication
>
> In the discussions around the OAuth WRAP spec, one of the questions often asked is, “why use SSL exclusively?” Several of us have done a lot of thinking on it and I wanted to articulate my understanding of the pros and cons of the approach for discussion. The use case I primarily have in mind is that of a web service, like Facebook, Twitter, or Google services. Our service is primarily authenticated via the Web, but we have a use for all of the WRAP profiles (web app, client app, desktop, mobile).
>
> Overall, I think that the simplicity of using SSL outweighs all the associated costs for most developers. However, we should offer signatures over plain (non-SSL) connections as an optional performance enhancement for advanced developers.
>
> == Advantages of using SSL for API calls
>
> -- It’s overwhelmingly simpler for developers.
>
> I’ve implemented OpenID and OAuth, and I’ve worked for years with developers trying to handle signatures on the Facebook Platform. In my experience, calculating signatures is one of the most complex and difficult parts of an authentication protocol, and developers often get it wrong. By moving that piece down the stack we can get it out of their way and let developers focus on building their apps.
>
> -- Existing tools ecosystem
>
> It’s not that SSL is a simpler encryption protocol than OAuth (it’s not), but rather that commonly available tools almost universally support it – every major web browser, as well as most libraries for making HTTP requests (like curl), has built-in support for SSL. For OAuth 1.0, you need to use a client library just to construct your very first request.
>
> -- Smaller client libraries
>
> A good chunk of code in many client libraries is devoted to calculating and verifying signatures. For example, the OpenID PHP library imports several BigMath modules and encryption schemes. Even the relatively simpler Facebook client library requires several functions to sign requests. This makes the client libraries a black box and impedes understanding.
>
> Wouldn’t it be great if we could write a protocol that doesn’t even require a client library to implement? If I could just make an authenticated API request in my browser, as easily as with Basic Auth?
>
> == Disadvantages of using SSL for API calls
>
> -- Difficulties of debugging
>
> Both signatures and SSL present difficulties in debugging, but they tend to be different. While with signatures you worry about composing the arguments wrong or using the wrong algorithm, with SSL you worry about reading the request over the wire. You can’t sniff a request, and to intercept it you need an HTTP proxy that understands SSL, and you need to worry about invalid certificate errors. To aid in this, providers will probably offer a non-SSL endpoint for debugging, but they may need to set up a sandbox environment to prevent damage from tokens exposed in plaintext.
>
> -- Variable costs for providers
>
> Server CPU costs will increase when handling SSL requests – especially on every API call instead of just at the auth stage. At scale, this can become expensive, although it can be offset by using specialized hardware to terminate the SSL connection.
> All the big companies I’ve talked to are comfortable trading these higher costs for increased adoption due to the simplicity.
>
> -- Fixed costs for smaller providers
>
> There is a fixed cost to obtaining and signing an SSL certificate, although that has dropped in recent years such that an operator can have a cert signed for a single domain fairly cheaply.
>
> -- Latency
>
> SSL connections take more time to establish than normal HTTP connections. Servers can use specialized hardware to speed this up, but clients rarely do, which means that for client-to-server API requests there may be somewhat higher latency on each request. Smaller, mobile devices may be disproportionately affected by this, but as they grow more powerful it’s less of a concern. Already today, newer phones can handle SSL just fine.
>
> -- Browser limitations for cross-domain communication
>
> A more niche disadvantage is that some cross-domain communication techniques require the protocol of the parent page to match that of the endpoint being queried. So, for example, in some browsers, for some API calls, it would be impossible to make an API call to an HTTPS endpoint from a normal HTTP page. However, as browsers advance and HTML5 methods like postMessage become more common, this will become less of an issue.
>
> -- Verifying information on the relying party
>
> If information is passed from a Service Provider to a Consumer through the user’s browser, then that information cannot be verified without an API call, unless a signature is provided. Similar to stateful vs. stateless mode in the OpenID 2.0 spec, the signature can serve as a performance enhancement to avoid an API call.
>
> == Providing a non-SSL option for the short head
>
> Most platforms tend to have a small number of fairly large developers and then a really large number of smaller developers. The former are typically experienced (or at least, they are by the time they get big) and have made a large investment in the platform. The latter range from experienced developers to amateurs to folks who would not typically program. For this spec to be successful, it must meet the needs of both groups.
>
> Facebook is very interested in adopting OAuth WRAP / 2.0 / whatever because we want to help our long-tail developers use our platform more easily. For the long tail, the simplicity provided by SSL will be crucial. It means smaller or nonexistent client libraries, it means that developers can just try out the API by typing it into a web browser, and it will help reduce debugging and maintenance costs.
>
> However, for the short head of high-volume developers, we probably want an option to use something other than SSL to make secure requests. These developers tend to have already picked the low-hanging performance fruit, and for them it may be a significant penalty to pay the SSL connection cost on every request. To that end, I think it’s pretty important that OAuth 2.0 support a method other than SSL as an option for advanced developers. But it should be just that – an option, and only for advanced developers, so that beginning programmers and folks learning an API don’t need to worry about signatures when they just want to play around.
>
> Please let me know if I’m missing something or if my assumptions sound incorrect.
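Going back to the “verifying information on the relying party” point above, here is a minimal sketch of what the signature-as-performance-enhancement looks like on the consumer side. The payload format, parameter names, and secret are all made up for illustration; none of this is defined by WRAP or OAuth 2.0:

    // Hypothetical: the provider passes a base64 payload plus an HMAC-SHA-256
    // signature through the user's browser; the consumer checks the signature
    // with its shared secret instead of making a verification API call.
    var crypto = require('crypto');

    function verifyPayload(payloadB64, sig, appSecret) {
      var expected = crypto.createHmac('sha256', appSecret)
                           .update(payloadB64)
                           .digest('hex');
      // A constant-time comparison would be preferable in real code.
      if (expected !== sig) return null;
      return JSON.parse(Buffer.from(payloadB64, 'base64').toString('utf8'));
    }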
> -Luke Shepard
> Software Engineer, Facebook Platform
_______________________________________________
OAuth mailing list
OAuth@ietf.org
https://www.ietf.org/mailman/listinfo/oauth