On GAE, writes are very expensive: each record is actually written to disk, its indexes are updated and written to disk, and all of this happens synchronously, so serially writing records takes a long time. Take a look at the batch get, put, and delete operations, which run in parallel rather than serially. These batch operations will be added to GQLDB eventually. Either way, to prepopulate a database on GAE you have to do it over a series of requests, because each request is capped at 10 seconds. See the bulk uploader: http://code.google.com/appengine/articles/bulkload.html

Robin

On Jan 27, 10:13 am, Iceberg <iceb...@21cn.com> wrote:
> Hi pals,
>
> I don't know whether this is a web2py problem or a GAE problem; anyway, I'll post it here and hope for some kind hints.
>
> I started my GAE experience just today. I managed to upload web2py 1.55.2 and its welcome application, and later one of my homebrew "no-db-required" applications; all of them work fine. Then I tried to port another, "serious" application to GAE. That application requires some initial data (20000+ records, about 500KB total) before it can do any real work.
>
> In my local development environment, I can easily set up an SQLite db, then visit something like "/myapp/default/putDataIntoSQLDB" to trigger the data initialization without any problem.
>
> But on GAE, I guess there is no way to supply these data in a predefined storage.db file, probably because GAE doesn't support file operations; am I correct? In fact, I tried defining "db=SQLDB('sqlite://storage.db')" in my db.py, but it caused an error on GAE:
>     NameError: global name 'sqlite3' is not defined
>
> So I used "db=GQLDB()" instead. Then the application starts, BUT when I visit "/myapp/default/putDataIntoGQLDB", it ALWAYS ends in "DeadlineExceededError" after about 8 seconds of execution, during which only a few dozen records are inserted into GQLDB.
> If this is because I called db.mytable.insert(...) too many times, is there some other way to do a batch insert()?
>
> By the way, when I later tried to reset the faulty db data, I found that more than 50% of the time GAE complains:
>     Timeout: datastore timeout: operation took too long.
> even when I simply invoke:
>     db(db.mytable.id>0).delete()
>
> I searched the web and found some rumors, such as:
> http://highscalability.com/google-appengine-second-look
> but I still can't believe the performance is as bad as what I described above.
>
> Did I miss something? Do you encounter the same problem when playing with GAE? Does your application successfully insert hundreds of records during one request? Any feedback will be highly appreciated!
>
> Sincerely,
> Iceberg

--
You received this message because you are subscribed to the Google Groups "web2py Web Framework" group.
To post to this group, send email to web2py@googlegroups.com
To unsubscribe from this group, send email to web2py+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/web2py?hl=en
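Robin's batch-write suggestion can be sketched roughly as below. The `chunk()` helper is plain Python invented here for illustration; the commented GAE portion assumes the `google.appengine.ext.db` API, whose `db.put()` accepts a list of entities and writes them in a single RPC (the datastore historically limited a batch to 500 entities). This is a sketch, not web2py's own API.

```python
def chunk(items, size):
    """Yield successive slices of `items`, each at most `size` elements."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# On GAE it would be used roughly like this (only runnable inside GAE;
# MyModel and names are hypothetical placeholders):
#
#   from google.appengine.ext import db
#   entities = [MyModel(name=n) for n in names]
#   for batch in chunk(entities, 500):   # stay under the batch limit
#       db.put(batch)                    # one RPC writes the whole batch
```

Even with batching, inserting 20000+ records will not fit into a single 10-second request, which is why the bulk uploader (or spreading the work over several requests) is still needed.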