Hello, 

I am trying to manage (upload and download) big files in Google Cloud 
Storage with a web2py app deployed on GAE.

As a first step, I have adapted this example: 
https://cloud.google.com/storage/docs/json_api/v1/json-api-python-samples

It works fine on localhost. Here is the controller: 

import argparse
import httplib2
import os
import sys
import json


from apiclient import discovery
from oauth2client import file
from oauth2client import client
from oauth2client import tools

# Define sample variables.
_BUCKET_NAME = 'mybucket'
_API_VERSION = 'v1'

# Parser for command-line arguments.
parser = argparse.ArgumentParser(
    description=__doc__,
    formatter_class=argparse.RawDescriptionHelpFormatter,
    parents=[tools.argparser])

# CLIENT_SECRETS is the name of a file containing the OAuth 2.0 information
# for this application, including client_id and client_secret. You can see
# the Client ID and Client secret on the APIs page in the Cloud Console:
# <https://console.developers.google.com/>
CLIENT_SECRETS = os.path.join(os.path.dirname(__file__), 'client_secrets.json')

# Set up a Flow object to be used for authentication.
# Add one or more of the following scopes. PLEASE ONLY ADD THE SCOPES YOU
# NEED. For more information on using scopes please see
# <https://developers.google.com/storage/docs/authentication#oauth>.
FLOW = client.flow_from_clientsecrets(
    CLIENT_SECRETS,
    scope=[
        'https://www.googleapis.com/auth/devstorage.full_control',
        'https://www.googleapis.com/auth/devstorage.read_only',
        'https://www.googleapis.com/auth/devstorage.read_write',
    ],
    message=tools.message_if_missing(CLIENT_SECRETS))

def index():
  cliente = CLIENT_SECRETS
  flow = FLOW
  # There are no command-line arguments in a web2py controller, so parse
  # an empty list just to get the default flags that run_flow() expects.
  flags = parser.parse_args([])
  # If the credentials don't exist or are invalid, run through the native
  # client flow. The Storage object will ensure that if successful the
  # good credentials will get written back to the file.
  storage = file.Storage('sample.dat')
  credentials = storage.get()
  if credentials is None or credentials.invalid:
    credentials = tools.run_flow(FLOW, storage, flags)

  # Create an httplib2.Http object to handle our HTTP requests, and
  # authorize it with our good Credentials.
  http = httplib2.Http()
  http = credentials.authorize(http)

  # Construct the service object for interacting with the Cloud Storage API.
  service = discovery.build('storage', _API_VERSION, http=http)

  # Initialize the strings returned to the view, so they exist even if a
  # request below fails.
  print1 = print2 = aviso = None
  try:
    req = service.buckets().get(bucket=_BUCKET_NAME)
    resp = req.execute()
    print1 = json.dumps(resp, indent=2)

    fields_to_return = \
        'nextPageToken,items(name,size,contentType,metadata(my-key))'
    req = service.objects().list(bucket=_BUCKET_NAME, fields=fields_to_return)
    # If you have too many items to list in one request, list_next() will
    # automatically handle paging with the pageToken.
    while req is not None:
      resp = req.execute()
      print2 = json.dumps(resp, indent=2)
      req = service.objects().list_next(req, resp)

  except client.AccessTokenRefreshError:
    aviso = ('The credentials have been revoked or expired, please re-run '
             'the application to re-authorize')

  form = SQLFORM(db.gfile)
  return dict(print1=print1, print2=print2, form=form)

I get the expected result.

But when I deploy it to Google App Engine, it raises an error ticket: 

14:49:26.005
Unable to store in FILE:
/base/data/home/apps/s~merebafs/2.381697639759293929/applications/MRBFILE/controllers/default.py

Traceback (most recent call last):
  File "/base/data/home/apps/s~merebafs/2.381697639759293929/gluon/restricted.py", line 224, in restricted
    exec ccode in environment
  File "/base/data/home/apps/s~merebafs/2.381697639759293929/applications/MRBFILE/controllers/default.py", line 12, in <module>
    import httplib2
  File "/base/data/home/apps/s~merebafs/2.381697639759293929/gluon/custom_import.py", line 86, in custom_importer
    raise ImportError, 'Cannot import module %s' % str(e)
ImportError: Cannot import module 'httplib2'


And I can't understand it, because httplib2 is a library they use in their 
own example. Any ideas?
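
From what I have read so far, the GAE sandbox does not provide httplib2 by 
itself; apparently the library has to be shipped inside the application 
directory so that it ends up on sys.path where web2py's custom_importer 
can find it. A minimal sketch of what I understand the usual setup to be 
(the 'lib' folder name is my own choice, filled with something like 
"pip install -t lib httplib2 google-api-python-client"):

# appengine_config.py, at the root of the GAE project.
# Sketch for the Python 2.7 runtime; 'lib' is a folder I created that
# holds copies of httplib2, oauth2client, googleapiclient, ...
from google.appengine.ext import vendor

# Put 'lib' on sys.path so that "import httplib2" succeeds on GAE.
vendor.add('lib')

Can someone confirm whether this is the right way to do it with web2py?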

The next step is to write functions to upload files to, and download files 
from, the bucket I accessed.
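
In case someone can confirm the approach before I go down this road, here 
is a sketch of what I plan to try with the apiclient media helpers 
(resumable transfers, since the files are big; the bucket and object names 
are placeholders):

import io
from apiclient.http import MediaFileUpload, MediaIoBaseDownload

def upload_object(service, bucket, local_path, object_name):
  # Resumable upload: the file is sent in chunks, which is the
  # recommended way for big files.
  media = MediaFileUpload(local_path, resumable=True)
  req = service.objects().insert(bucket=bucket, name=object_name,
                                 media_body=media)
  resp = None
  while resp is None:
    # next_chunk() returns (status, response); response stays None
    # until the last chunk has been uploaded.
    status, resp = req.next_chunk()
  return resp

def download_object(service, bucket, object_name, local_path):
  # get_media() asks for the object payload instead of its metadata.
  req = service.objects().get_media(bucket=bucket, object=object_name)
  with io.FileIO(local_path, 'wb') as fh:
    downloader = MediaIoBaseDownload(fh, req)
    done = False
    while not done:
      status, done = downloader.next_chunk()

(I understand that on GAE itself I would have to stream to and from the 
request instead of a local file, since the GAE filesystem is read-only.)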

Thanks
