The thing is that when you read an entity from the datastore, you always read
all of its properties. A single entity can be up to 1MB, so I can imagine that
reading 500 of them could hit the instance's memory limit. The solution would
be to use a GAE query cursor and chain your task: process one batch, then
re-enqueue the handler with the cursor. In pseudocode:

def request_handler():
  max_size = 400

  # resume from the position stored in the cursor, if one was passed
  if request.cursor:
    entities = query(max_size, request.cursor)
  else:
    entities = query(max_size)

  process(entities)

  # a full batch means there may be more entities left to process
  if entities.size() == max_size:
    # read cursor
    cursor = entities.get_cursor()
    # start new task, pass cursor as parameter
    queue.add("/request_handler", {"cursor":cursor})

I don't think DAL has GAE cursor support implemented, so the disadvantage of
this approach is that you would have to use the GAE API directly.
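To illustrate the idea without the GAE API, here is a minimal, self-contained
Python sketch of the same cursor-chaining pattern. The `query` function and
the integer cursor are stand-ins for the datastore query and its opaque
cursor, and the driver loop plays the role of the task queue re-invoking the
handler:

```python
# Plain-Python sketch of cursor chaining (not the real GAE API): each
# "task" reads at most max_size entities starting at the cursor,
# processes them, and re-enqueues itself until the query is exhausted.

MAX_SIZE = 400

def query(entities, max_size, cursor=0):
    """Hypothetical stand-in for a datastore query with a cursor."""
    batch = entities[cursor:cursor + max_size]
    return batch, cursor + len(batch)

def request_handler(entities, processed, cursor=0):
    batch, next_cursor = query(entities, MAX_SIZE, cursor)
    processed.extend(batch)      # stand-in for process(entities)
    if len(batch) == MAX_SIZE:
        return next_cursor       # full batch: enqueue a follow-up task
    return None                  # short batch: all done

# Drive the "task queue" loop: 1000 entities -> 400 + 400 + 200.
entities = list(range(1000))
processed = []
cursor, tasks = 0, 0
while cursor is not None:
    tasks += 1
    cursor = request_handler(entities, processed, cursor)
print(tasks, len(processed))  # 3 1000
```

Each task only ever holds `max_size` entities in memory, which is the whole
point: the memory cost per request stays bounded no matter how large the
result set grows.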
