I'm considering the possibility of using the datastore's allocate_ids() as a
generic mechanism for maintaining persistent counters. I figure this is
probably a dumb idea, but thought I'd pose the question anyway. Here is
some code:
import logging
from google.appengine.ext import db

keys = db.allocate_ids(db.Key.from_path('SomeCounter', 1), 1)
logging.info("keys: %s" % str(keys))
keys = db.allocate_ids(db.Key.from_path('SomeOtherCounter', 1), 1)
logging.info("keys: %s" % str(keys))
keys = db.allocate_ids(db.Key.from_path('_DchdHdP-G7', 1), 1)
logging.info("keys: %s" % str(keys))
This gives the following log output in the SDK:
INFO 2011-01-29 15:06:04,218 main.py:35] keys: (1, 1)
INFO 2011-01-29 15:06:04,219 main.py:37] keys: (2, 2)
INFO 2011-01-29 15:06:04,219 main.py:39] keys: (3, 3)
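To make question 1 below concrete, here is a toy in-memory model (just an illustration I wrote, not how the datastore actually implements allocation) of the two behaviours: a per-kind counter, which is what I expected, versus a single counter shared across all kinds, which is what the SDK log above matches:

```python
from collections import defaultdict

class PerKindAllocator:
    """One counter per kind: every kind's range starts from 1."""
    def __init__(self):
        self.counters = defaultdict(int)

    def allocate(self, kind, n=1):
        # Return an inclusive (start, end) range, like allocate_ids().
        start = self.counters[kind] + 1
        self.counters[kind] += n
        return (start, self.counters[kind])

class SharedAllocator:
    """A single counter shared across all kinds."""
    def __init__(self):
        self.counter = 0

    def allocate(self, kind, n=1):
        start = self.counter + 1
        self.counter += n
        return (start, self.counter)

per_kind = PerKindAllocator()
shared = SharedAllocator()
for kind in ('SomeCounter', 'SomeOtherCounter', '_DchdHdP-G7'):
    print(per_kind.allocate(kind), shared.allocate(kind))
# per_kind gives (1, 1) for every kind;
# shared gives (1, 1), (2, 2), (3, 3), matching the SDK output above.
```

The SDK's output is consistent with the shared-counter model, which is what prompted my first question.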
Looking at the datastore viewer in the admin console, there are no model
entities corresponding to these 'kinds', which is expected. Some questions:
1. Shouldn't I be getting a value of (1, 1) back for each of those
allocations, because they are different 'kinds'?
2. If I should, why is this mechanism a bad idea for managing arbitrary
counters?
3. Is there any way to reset the counter for a kind? Or to set it to an
arbitrary value? Obviously doing so would result in key collisions if I were
using it normally, but the datastore knows how to handle that error state.
Cheers,
Colin