I haven't used that code in a long time as my tables are too big. :(

What's the traceback that you get?

On 6/19/13 7:23 , peibol wrote:
BigTable. I don't have any models other than the standard auth ones and the
built-in wiki ones, and I'm using this code (just the one given in the book):

def import_and_sync():
    form = FORM(INPUT(_type='file', _name='data'), INPUT(_type='submit'))
    if form.process().accepted:
        db.import_from_csv_file(form.vars.data.file, unique=False)
        # for every table
        for table in db.tables:
            # for every uuid, delete all but the latest
            items = db(db[table]).select(db[table].id,
                                         db[table].uuid,
                                         orderby=db[table].modified_on,
                                         groupby=db[table].uuid)
            for item in items:
                db((db[table].uuid == item.uuid) &
                   (db[table].id != item.id)).delete()
    return dict(form=form)


import cStringIO

def export():
    s = cStringIO.StringIO()
    db.export_to_csv_file(s)
    response.headers['Content-Type'] = 'text/csv'
    return s.getvalue()
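
A tiny variant of export() above (my suggestion, not from the book): adding a
Content-Disposition header so the browser offers the result as a named .csv
download instead of rendering it inline:

import cStringIO

def export():
    s = cStringIO.StringIO()
    db.export_to_csv_file(s)
    response.headers['Content-Type'] = 'text/csv'
    # optional: suggest a filename for the download
    response.headers['Content-Disposition'] = 'attachment; filename=db_export.csv'
    return s.getvalue()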





On Wednesday, June 19, 2013 15:14:57 UTC+2, Christian Foster Howes
wrote:

Are you using BigTable or Google Cloud SQL for data storage?  I'm
surprised that an import into BigTable would give an integrity error.

Note that if you run the import/export as a controller action, you will be
limited by what you can do in 128MB of RAM and 60 seconds of processing,
unless you use larger instance classes and/or backend instances.
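
One way to stay inside those limits (a workaround sketch of my own, not from
the book) is to export one table per request rather than the whole database at
once. This assumes the usual web2py controller globals (db, request, response,
HTTP); export_table and the URL layout are hypothetical names:

import cStringIO

def export_table():
    # called as e.g. /yourapp/default/export_table/wiki_page  (hypothetical URL)
    tablename = request.args(0)
    if tablename not in db.tables:
        raise HTTP(404, 'unknown table')
    s = cStringIO.StringIO()
    # Rows.export_to_csv_file writes only this table's rows as CSV
    db(db[tablename]).select().export_to_csv_file(s)
    response.headers['Content-Type'] = 'text/csv'
    return s.getvalue()

Each response is then one table's worth of data, which is easier to keep under
the 128MB / 60 second frontend limits, and the resulting file can be
re-imported on localhost with db[tablename].import_from_csv_file(...).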

cfh

On 6/19/13 4:05 , peibol wrote:
Thanks Christian. What I'm considering is using an export/import function in
the app, visible only to the administrator, because I want to develop a
kind of wiki, with its content, locally. So I'll use export_to_csv_file
and import_from_csv_file.
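
For the "visible only to the administrator" part, a minimal sketch assuming an
'admin' group created through web2py's Auth (the group name, and adding
yourself to it, are setup steps not shown in the thread):

# one-time setup, e.g. from appadmin or a model file:
#     group_id = auth.add_group('admin', 'site administrators')
#     auth.add_membership(group_id, auth.user_id)

@auth.requires_membership('admin')      # only 'admin' members may call this
def import_and_sync():
    form = FORM(INPUT(_type='file', _name='data'), INPUT(_type='submit'))
    if form.process().accepted:
        db.import_from_csv_file(form.vars.data.file, unique=False)
        # ... plus the per-uuid clean-up loop from the snippet earlier in the thread
    return dict(form=form)

The same decorator on export() keeps the dump itself private as well.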

The export process is working for me right now, but the import process
gives an integrity error.

From what I read in the book, it must be some issue with the uuids...
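
If the integrity error really is about the uuids, one thing worth checking is
that every table carries the uuid / modified_on fields the book's sync code
relies on (the built-in auth and wiki tables don't necessarily define a uuid
column). A sketch of the book's model pattern; the table and field names other
than uuid / modified_on are only examples:

import uuid

db.define_table('page',                       # example table name
    Field('uuid', length=64, default=lambda: str(uuid.uuid4())),
    Field('modified_on', 'datetime',
          default=request.now, update=request.now),
    Field('title'),
    Field('body', 'text'),
    format='%(title)s')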



On Wednesday, June 19, 2013 07:28:02 UTC+2, Christian Foster
Howes
wrote:

I wouldn't copy data, personally; I consider localhost a test environment
and GAE proper production, and I just make my production data there.

If you do want to copy data, look at the GAE bulk loader:
https://developers.google.com/appengine/docs/python/tools/uploadingdata



