This looks painful to me for a few reasons:

1. Hitting the database on every request is expensive. Building this with the foresight that a read-through cache will eventually be needed might be a good idea.

2. Some of the SQLAlchemy query generation is likely to be expensive on the Python side. It would probably make sense to structure this to take advantage of the newer "baked" queries, which cache the compiled SQL so the Python-side construction cost is paid only once (http://docs.sqlalchemy.org/en/latest/orm/extensions/baked.html).

3. If you're locking yourself to Postgres for the JSONB support anyway, you might as well develop this as a Postgres function plus Python code to interact with it.
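To illustrate the first point, here is a minimal read-through cache sketch in plain Python. The names (`read_through`, `fetch_user`, `CACHE_TTL`) and the TTL-based eviction policy are my own illustration, not from the original discussion; a real deployment would likely back this with memcached or Redis rather than a module-level dict.

```python
import functools
import time

CACHE_TTL = 60.0  # seconds; illustrative value

_cache = {}  # key -> (expires_at, value)

def read_through(ttl=CACHE_TTL):
    """Decorator: consult the cache first, fall back to the DB loader."""
    def decorator(loader):
        @functools.wraps(loader)
        def wrapper(key):
            now = time.monotonic()
            hit = _cache.get(key)
            if hit is not None and hit[0] > now:
                return hit[1]          # cache hit: no database round trip
            value = loader(key)        # cache miss: hit the database
            _cache[key] = (now + ttl, value)
            return value
        return wrapper
    return decorator

calls = 0

@read_through()
def fetch_user(user_id):
    # Stands in for a real database query (e.g. a SQLAlchemy lookup).
    global calls
    calls += 1
    return {"id": user_id, "name": "user%d" % user_id}

fetch_user(1)
fetch_user(1)  # second call is served from the cache
print(calls)   # 1
```

The point of building this in from the start is that the loader functions already have a cache-key-shaped interface, so swapping the dict for a shared cache later doesn't require touching the call sites.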
--
You received this message because you are subscribed to the Google Groups "pylons-discuss" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/pylons-discuss.
For more options, visit https://groups.google.com/d/optout.
