FYI, this is the enforcer function we wrote for our implementation, essentially a rewrite of request.requires_https():
def force_https(trust_proxy=False):
    """
    Enforces HTTPS in appropriate environments.

    Args:
        trust_proxy: Whether the proxy header 'http_x_forwarded_proto'
            can be trusted to determine SSL. (Set this only if ALL your
            traffic comes via a trusted proxy.)
    """
    # If localhost, exit:
    if request.env.remote_addr == "127.0.0.1":
        return
    # If cron job or scheduler, exit:
    cronjob = request.global_settings.cronjob
    cmd_options = request.global_settings.cmd_options
    if cronjob or (cmd_options and cmd_options.scheduler):
        return
    # If already HTTPS, exit:
    if request.env.wsgi_url_scheme in ['https', 'HTTPS']:
        return
    # If an HTTPS request was forwarded over HTTP by an SSL-terminating proxy, exit:
    if trust_proxy and request.env.http_x_forwarded_proto in ['https', 'HTTPS']:
        return
    # Otherwise, redirect to HTTPS:
    redirect(URL(scheme='https', args=request.args, vars=request.vars))

On Friday, September 21, 2012 9:53:36 AM UTC-4, Yarin wrote:
>
> The completely naive approach would be to do:
>
> if request.env.http_x_forwarded_for and \
>     request.env.http_x_forwarded_proto in ['https', 'HTTPS']:
>     # Is HTTPS...
>
> But you cannot detect whether proxied traffic is real, because headers are
> unreliable. Instead it is up to the user to securely set up a server behind
> a proxy and set the .is_proxied flag themselves.
>
> *Example:*
> We put our app server behind an SSL-terminating load balancer in the
> cloud. The domain app.example.com points to the load balancer, so we
> configure the app server's Apache to allow traffic from that domain only
> and block any direct outside traffic. Then we set *auth.settings.is_proxied*
> to tell web2py "this proxy traffic is legit".
>
> HTTPS/443 requests will hit the load balancer and be transformed into
> HTTP/80 traffic with the *http_x_forwarded_for* and
> *http_x_forwarded_proto* headers set. Now we can confidently check:
>
> if auth.settings.is_proxied and \
>     request.env.http_x_forwarded_proto in ['https', 'HTTPS']:
>     # Is HTTPS...
>
> In other words, the *http_x_forwarded_for* header is useless, and you can't
> mix direct and proxied traffic. To be able to handle proxy-terminated SSL,
> we need to know that *all* the traffic comes via a trusted proxy.
>
> On Friday, September 21, 2012 8:40:35 AM UTC-4, Massimo Di Pierro wrote:
>>
>> Can you suggest a way to detect that?
>>
>> On Thursday, 20 September 2012 13:56:55 UTC-5, Yarin wrote:
>>>
>>> @Massimo - that'd be great.
>>>
>>> One more kink to throw in is recognizing proxied SSL calls. This
>>> requires knowing whether you can trust the traffic headers (e.g. having
>>> Apache locked down to all traffic except your load balancer), so maybe
>>> we need a trust_proxied_ssl or is_proxied setting somewhere?
>>>
>>> if request.env.http_x_forwarded_for and \
>>>     request.env.http_x_forwarded_proto in ['https', 'HTTPS'] and \
>>>     auth.settings.is_proxied:
>>>
>>> On Thursday, September 20, 2012 12:52:22 PM UTC-4, Massimo Di Pierro
>>> wrote:
>>>>
>>>> I think we should do something like this.
>>>>
>>>> I think we should have auth.settings.force_ssl_login
>>>> and auth.settings.force_ssl_session.
>>>> We could add a secure=True option to existing requires validators.
>>>>
>>>> This should not be enforced from localhost.
>>>>
>>>> On Thursday, 20 September 2012 09:07:14 UTC-5, Yarin wrote:
>>>>>
>>>>> A proposal for improving SSL support in web2py
>>>>>
>>>>> For authenticated web applications, there are two "grades" of SSL
>>>>> implementation: forcing SSL on login vs. forcing SSL on the entire
>>>>> authenticated session.
>>>>>
>>>>> In the first case, HTTPS is forced on login/registration but reverts
>>>>> to HTTP upon authentication. This prevents passwords from being sent
>>>>> unencrypted, but won't prevent session hijacking, as the session
>>>>> cookie can still be compromised on subsequent HTTP requests. (See
>>>>> Firesheep <http://codebutler.com/firesheep> for details.)
>>>>> Nonetheless, many sites choose this approach for performance reasons,
>>>>> as SSL-delivered content is not cached by browsers as efficiently
>>>>> (discussed on the 37signals blog
>>>>> <http://37signals.com/svn/posts/1431-mixed-content-warning-how-i-loathe-thee>).
>>>>>
>>>>> In the second case, the entire authenticated session is secured by
>>>>> forcing all traffic to go over HTTPS while a user is logged in *and* by
>>>>> securing the session cookie so that it will only be sent by the browser
>>>>> over HTTPS.
>>>>>
>>>>> (Also discussed in the web2py users group - Auth over SSL
>>>>> <https://groups.google.com/d/msg/web2py/7qoHMs-4Va8/jRFOqYHri4gJ>)
>>>>>
>>>>> web2py should make it easier to deal with these scenarios. I just
>>>>> implemented a case-1 type solution, and it took quite a bit of work.
>>>>>
>>>>> Moreover, web2py currently provides two SSL-control functions which,
>>>>> taken on their own, can lead to problems for the uninitiated:
>>>>>
>>>>> - session.secure() ensures that the session cookie is only transmitted
>>>>>   over HTTPS, but doesn't force HTTPS, so any subsequent session calls
>>>>>   made over HTTP simply won't have access to the auth session - and
>>>>>   this is not obvious (correct me if I'm wrong)
>>>>> - request.requires_https() (undocumented?) is a misnomer, because it
>>>>>   forces HTTPS but then assumes a case-2 scenario and secures the
>>>>>   session cookie
>>>>>
>>>>> *Proposals:*
>>>>>
>>>>> - SSL auth settings
>>>>>   - auth.settings.force_ssl_login - forces HTTPS for
>>>>>     login/registration
>>>>>   - auth.settings.force_ssl_session - forces HTTPS throughout an
>>>>>     authenticated session and secures the session cookie (if True,
>>>>>     force_ssl_login is not necessary)
>>>>> - Other more granular controls
>>>>>   - @requires_https() - decorator for controller functions that
>>>>>     forces HTTPS for that function only
>>>>>   - secure=True option on forms ensures submission over HTTPS
>>>>>
>>>>> --
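A note on testing: the decision logic in force_https() at the top of this message can be sanity-checked without a running web2py instance by factoring the scheme/proxy test into a plain function. This is only a sketch; is_https() and its arguments are hypothetical names of my own, not web2py API:

```python
# Standalone sketch of the scheme-detection logic in force_https() above.
# is_https() mimics the two "already HTTPS" checks: the WSGI scheme, and
# the forwarded-proto header when (and only when) the proxy is trusted.

def is_https(url_scheme, forwarded_proto=None, trust_proxy=False):
    """Return True if a request should be treated as already-HTTPS.

    url_scheme      -- what request.env.wsgi_url_scheme would hold
    forwarded_proto -- what request.env.http_x_forwarded_proto would hold
    trust_proxy     -- honor the forwarded header only when ALL traffic
                       arrives via a trusted SSL-terminating proxy
    """
    if (url_scheme or '').lower() == 'https':
        return True
    return bool(trust_proxy and (forwarded_proto or '').lower() == 'https')

# Direct HTTPS is always recognized, in either case:
assert is_https('https') and is_https('HTTPS')
# Plain HTTP with no proxy would fall through to the redirect branch:
assert not is_https('http')
# Proxy-terminated SSL only counts when the proxy is trusted:
assert not is_https('http', forwarded_proto='https')
assert is_https('http', forwarded_proto='https', trust_proxy=True)
```

Anything that falls through both checks corresponds to the final redirect() in force_https().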
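To make the proposed @requires_https() decorator concrete, here is one possible shape for it. Again just a sketch: the enforce parameter is a hypothetical stand-in for force_https() (which in web2py would redirect() away on plain HTTP), so the example stays self-contained and runnable outside web2py:

```python
from functools import wraps

def requires_https(enforce=lambda: None):
    """Hypothetical decorator from the proposal above: run an HTTPS
    enforcement callable before the wrapped controller action executes.

    enforce -- stand-in for force_https(); in web2py it would redirect()
               away (raising HTTP) when the request arrives over plain HTTP.
    """
    def decorator(action):
        @wraps(action)
        def wrapper(*args, **kwargs):
            enforce()  # would redirect (never return) if HTTPS must be forced
            return action(*args, **kwargs)
        return wrapper
    return decorator

# Usage sketch: record that enforcement ran, then call the action.
checked = []

@requires_https(enforce=lambda: checked.append(True))
def account():
    return 'sensitive page'

assert account() == 'sensitive page'
assert checked == [True]
```

The factory form (requires_https() with parentheses) matches how web2py's existing auth.requires_* decorators are written, which is why the proposal spells it that way.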