I have a weird dictionary key issue: I have an 'Article' model, with text fields for Year and Country and foreign keys for Magazine and Author. For each criterion (e.g. Year=2005) I generate a frequency list, which I use to keep a count. To do this, I subclassed dict like so:
    class dictinc(dict):
        """Dictionary with a 'set or increment' method."""
        def inc(self, key):
            self[key] = self.get(key, 0) + 1

So I create yeardict = dictinc(), and if I want to add 1 to the count for Year=2005, I use:

    yeardict.inc(2005)

This works fine whenever I use text or integers as keys. However, if I try to use a model instance (e.g. a Magazine instance), it works fine on my development machine (Vista), but not on my production machine (Linux, on WebFaction). Locally, if I have two articles that reference "magazine1", and call for each one:

    magazine = article.magazine
    magazinedict.inc(magazine)

then magazinedict becomes:

    {<Magazine: magazine1>: 2}

However, on the production server, magazinedict is:

    {<Magazine: magazine1>: 1, <Magazine: magazine1>: 1}

Yet if I check whether the keys are equal, they compare equal:

    >>> a, b = magazinedict.keys()
    >>> a == b
    True

So I guess the problem is using a model instance as a dictionary key. What's weird is the development/production discrepancy. Development is running 0.97, production 0.96, which I guess might account for the difference, no? Any ideas on how to fix this?
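For what it's worth, the split keys happen exactly when instances compare equal but hash differently: Python's default __hash__ is identity-based, so two separate instances representing the same database row land in separate dict slots unless the class also defines a pk-based __hash__ to match its __eq__. Here is a minimal sketch with a plain Python class standing in for a model instance (the Magazine class below is hypothetical, not Django's real Model base class, and a pk-based __hash__ is only safe once the instance has been saved and pk is set):

    # Stand-in for a Django model instance; 'pk' mimics the primary key.
    class Magazine:
        def __init__(self, pk, name):
            self.pk = pk
            self.name = name

        def __repr__(self):
            return "<Magazine: %s>" % self.name

        # Equality by primary key, so two instances of the same row compare equal.
        def __eq__(self, other):
            return isinstance(other, Magazine) and self.pk == other.pk

        # Without this matching __hash__, equal instances hash by identity
        # and occupy separate dict slots -- the behavior seen in production.
        def __hash__(self):
            return hash(self.pk)

    class dictinc(dict):
        """Dictionary with a 'set or increment' method (from the post)."""
        def inc(self, key):
            self[key] = self.get(key, 0) + 1

    counts = dictinc()
    counts.inc(Magazine(1, "magazine1"))
    counts.inc(Magazine(1, "magazine1"))  # a second instance of the same row
    print(counts)  # {<Magazine: magazine1>: 2}

A workaround that sidesteps hashing entirely is to key the counts by the primary key itself, e.g. magazinedict.inc(article.magazine_id), and look the Magazine objects up afterwards.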