Option A2: Make the models a function placed inside a module that lives at applications/my_lib_app/modules/my_stuff.py; then in each of your apps you write local_import('my_stuff', app='my_lib_app').
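To make the pattern concrete, here is a minimal, framework-free sketch of Option A2. The names (define_tables, FakeDAL, the table and field names) are my own assumptions for illustration; in a real deployment the shared module sits in applications/my_lib_app/modules/my_stuff.py, db is a real web2py DAL instance, and the local_import call shown in the comment is how each app's model file would pull the module in:

```python
# Sketch of the Option A2 pattern. FakeDAL is a stand-in for web2py's DAL,
# just enough to show that the shared module defines tables on whatever
# db object the calling app hands it, without touching any globals itself.

class FakeDAL:
    """Minimal stand-in for web2py's DAL (illustration only)."""
    def __init__(self):
        self.tables = {}

    def define_table(self, name, *fields):
        self.tables[name] = list(fields)
        return self.tables[name]


# --- contents of applications/my_lib_app/modules/my_stuff.py ---
def define_tables(db):
    """Define the shared models on the db object the caller passes in."""
    db.define_table('person', 'name', 'email')
    db.define_table('article', 'title', 'author_id')


# --- in each consuming app's model file you would write roughly:
#     my_stuff = local_import('my_stuff', app='my_lib_app')
#     my_stuff.define_tables(db)
db = FakeDAL()
define_tables(db)
print(sorted(db.tables))  # -> ['article', 'person']
```

The key design point is that the shared module never creates or imports a db connection of its own; each app stays in control of its own connection string and migration settings and merely delegates the table definitions.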
It does not require extra installation on the production machines; you just install three normal web2py apps instead of your original two. I know this breaks the convention that apps should be atomic and therefore should not rely on each other. But hey, it is your choice to make, whichever way you feel comfortable. In fact I have had a my_lib_app for a long time. Shhhh, don't tell Massimo. ;)

Regards,
Iceberg

On Apr 30, 1:52 am, Thadeus Burgess <thade...@thadeusb.com> wrote:
> I know this has come up previously, however there is still no good
> solution to this problem.
>
> I have two apps that I want to share a couple of the same
> database models. In import-based frameworks this is easy: you just
> import the database models like any other Python package.
>
> So far these are the potential work-arounds:
>
> * Option A: Make the models a function placed inside a module that lives on
>   the sys.path. You pass a db object to this function and it
>   creates the db models.
> * Option B: Separate the shared models into their own models/*.py file.
>   Symlink this file into the other app that shares the models.
> * Option C: Combine both apps into one, using routes hacks and other boolean
>   flags to determine which models get executed.
>
> The issue is not whether it is possible, but what is maintainable.
>
> Option C is a nightmare: you have to maintain each of the little boolean
> flags that determine what should get executed.
>
> Option B is a disaster waiting to happen on the production machines.
>
> Option A seems to be the best, but requires extra installation on the
> production machines, since this module becomes a "dependency" of the
> web2py apps.
>
> Which would you use? Have any other suggestions?
>
> --
> Thadeus
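For reference, Option B from the quoted message boils down to a single symlink. The sketch below uses made-up paths under a demo/ directory; in real life both apps would live under web2py/applications/ and the link would be created there:

```shell
# Hypothetical illustration of Option B: one app owns the shared model file,
# the other app gets a symlink to it, so both execute the same code at
# request time. Paths are invented for this demo.
mkdir -p demo/app_one/models demo/app_two/models
echo "db.define_table('person', Field('name'))" > demo/app_one/models/shared_models.py
ln -s ../../app_one/models/shared_models.py demo/app_two/models/shared_models.py
# Both apps now see identical model definitions:
cat demo/app_two/models/shared_models.py
```

This is why Thadeus calls it a disaster waiting to happen in production: deployment tools that copy or pack apps (e.g. web2py's .w2p packaging) can silently turn the symlink into a stale copy, and the two apps drift apart.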