Hi all -

I wonder if anyone else has hit the file-size upload limits on GAE?

This limit is supposed to be 10MB, which would be fine for my needs.
The GAE file system is read-only so I save upload data in the database
like this:

Field('file','upload',uploadfield='file_binary', ...
Field('file_binary','blob',...
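
For context, here is a minimal sketch of the full model I mean (the
table name and everything beyond the two fields above are assumptions,
not from my actual app):

```python
# Sketch of the upload-to-blob model using the web2py DAL.
# Only the 'file' and 'file_binary' field names are real; the
# 'document' table name and the rest are illustrative assumptions.
db.define_table('document',
    Field('file', 'upload', uploadfield='file_binary'),
    Field('file_binary', 'blob'),  # stored in the GAE datastore, max 1MB per blob
)
```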

This, however, imposes a further limit: the blob database type in GAE
has a maximum size of 1MB per blob, so my maximum file upload size is
effectively 1MB. Any file upload larger than this gives me the error:

RequestTooLargeError: The request to API call datastore_v3.Put() was
too large.

Has anyone tried splitting and joining one file across several blobs
in GAE to allow uploads up to the 10MB limit? Is this theoretically
possible?

I have other options, like just avoiding GAE for this particular app,
or, for large files, manually syncing them across to the app file
system as part of deployment with appcfg.py. But I'd like to look into
storing one file split across blobs just to see if it can be done.
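
The split/join part itself seems straightforward. A minimal sketch of
what I have in mind (pure Python; the helper names, the chunk size, and
the idea of keying each chunk by a sequence number are my assumptions,
not anything web2py or GAE provides):

```python
# Sketch: split upload data into chunks that each fit under GAE's
# 1MB-per-blob datastore limit, then rejoin them on download.
# In practice each chunk would be stored as its own datastore row,
# keyed by the file's id plus its sequence number (assumed design).

CHUNK_SIZE = 1000 * 1000 - 1024  # leave some headroom below 1MB


def split_blob(data, chunk_size=CHUNK_SIZE):
    """Return the byte string split into an ordered list of chunks."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]


def join_blob(chunks):
    """Reassemble the original byte string from its ordered chunks."""
    return b''.join(chunks)
```

Whether the datastore Put() call on each individual chunk stays under
the request-size limit is the part I'd need to test.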

I can think of one other nasty hack, which would involve building a
helper utility to programmatically call appcfg.py: file uploads go to
the helper, which then syncs with the app file system on GAE. The
helper could be a separate website.. a browser plug-in.. or maybe
appcfg.py hosted in its own GAE app?! Taken to an extreme this could
give a GAE-hosted app full read-write access to its own filesystem by
going through a helper proxy. Maybe a fun thing to hack up one day, but
I just need 10MB file uploads for now, so I'll try stream
splitting / joining first.

- Alex

-- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
