> Assuming this can be done... are we sure that accessing S3 via REST
> from GAE does not exceed the time limit?

If I remember correctly, you can stream uploads directly to S3, which
bypasses GAE and avoids any timeout.  So from GAE, at request time,
you generate a signed upload policy for S3, then have the browser POST
the form data directly to the S3 URL.  When the upload completes, S3
can redirect the user to a URL on your site that you specify.  (This
is the same way the YouTube browser upload API works.)

Also note that GAE should be releasing large file support soon, which
should work similarly to S3.

Robin

On Jul 9, 3:55 pm, mdipierro <mdipie...@cs.depaul.edu> wrote:
> I agree. This is not possible now but I guess it can be accommodated.
> There are three pieces here that need to work together.
>
> 1) the ability to override the upload function
> 2) disable the web2py behavior on GAE of creating a blob field for
> every uploaded file
> 3) design custom uploader and downloader functions that work with S3
>
> Assuming this can be done... are we sure that accessing S3 via REST
> from GAE does not exceed the time limit? Before implementing something
> like this, do we know it is worthwhile?
>
> Massimo
>
> On Jul 9, 3:38 pm, Fran <francisb...@googlemail.com> wrote:
>
> > On Jul 9, 9:36 pm, Fran <francisb...@googlemail.com> wrote:
>
> > > Seems like it would be a nice option to be able to have the /uploads
> > > folder be there.
>
> > Django has very flexible upload handling:
> > http://docs.djangoproject.com/en/dev/topics/http/file-uploads/
>
> > F
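For what it's worth, the three pieces listed above boil down to a pair
of store/retrieve hooks for the upload field. A minimal sketch, with an
in-memory dict standing in for a real S3 client (all names here are
hypothetical, not existing web2py API):

```python
import io

# Hypothetical in-memory stand-in for an S3 client, so the shape of the
# custom store/retrieve hooks can be shown end to end without network access.
_FAKE_S3 = {}

def s3_put(key, data):
    _FAKE_S3[key] = data

def s3_get(key):
    return _FAKE_S3[key]

def s3_store(file, filename, path=None):
    """Store hook: receives the open file object and original filename,
    returns the name to record in the upload field (instead of writing
    to the local /uploads folder or a GAE blob)."""
    key = "uploads/" + filename
    s3_put(key, file.read())
    return key

def s3_retrieve(name, path=None):
    """Retrieve hook: receives the stored name and returns
    (original_filename, file-like stream) for the download action."""
    return (name.rsplit("/", 1)[-1], io.BytesIO(s3_get(name)))
```

Swapping `s3_put`/`s3_get` for real S3 REST calls (or the signed-POST
approach above the quote) is the part that still needs points 1 and 2
from Massimo's list.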
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"web2py Web Framework" group.
To post to this group, send email to web2py@googlegroups.com
To unsubscribe from this group, send email to 
web2py+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/web2py?hl=en
-~----------~----~----~----~------~----~------~--~---