I have a web2py application that uses PostgreSQL and runs on three 
servers: QA, Stage, and Prod. Whenever I make schema changes in my 
application, my deployment pipeline stops working. The pipeline works 
like this: updated code is pushed to the QA server, which has an empty 
database and no web2py database files, so the updated schema is built 
there from scratch. If everything works on QA, a script pushes the code 
to the Stage server. Stage gets its database (both the physical tables 
and the web2py database files) from the Prod server. When I run my 
application on Stage, it crashes, because the code and the database are 
in different states: the database brought in from Prod does not have the 
new tables or columns that are now in my code.
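
As I understand it, the crash is rooted in web2py's migration metadata: 
the DAL compares each define_table() against the matching .table file 
under the application's databases/ folder and issues CREATE/ALTER 
statements for any difference, so copied metadata that does not line up 
with the actual database can make those statements fail. A minimal 
sketch of a model, assuming the standard DAL signature (the connection 
string and the table are placeholders):

# In a web2py model file, DAL and Field are already in scope; the
# explicit import below is only needed outside the framework.
from gluon.dal import DAL, Field

db = DAL('postgres://user:pass@localhost/mydb',  # placeholder URI
         migrate_enabled=True,    # allow migrations at all
         fake_migrate_all=False)  # True: rewrite .table files, no SQL

# With migrate=True (the default), web2py diffs this definition against
# databases/<hash>_thing.table and alters the real table to match.
db.define_table('thing',
                Field('name'),
                Field('added_in_new_release'))  # new column this release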
Is there a way to fix this process through automation? I have been able 
to fix it manually by importing the physical database from Prod into an 
empty database, deleting the web2py database files from Stage, setting 
fake_migrate to True, and then running the application. I need a more 
solid approach that will let me cope with these schema changes (however 
drastic they may be) through automation, on the Prod server.
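
For reference, this is roughly what automating my manual fix might look 
like on Stage (a sketch only: the paths, database names, and dump format 
are assumptions, and the model would have to read the fake-migrate flag 
from somewhere, e.g. the flag file used below):

#!/usr/bin/env python
import glob, os, subprocess

APP = '/opt/web2py/applications/myapp'  # hypothetical app path
DUMP = '/tmp/prod.sql'                  # hypothetical plain-SQL pg_dump
DBNAME = 'myapp_stage'                  # hypothetical database name

# 1. Recreate an empty database and load the Prod dump into it.
subprocess.check_call(['dropdb', '--if-exists', DBNAME])
subprocess.check_call(['createdb', DBNAME])
subprocess.check_call(['psql', '-d', DBNAME, '-f', DUMP])

# 2. Delete web2py's migration metadata (the .table files).
for f in glob.glob(os.path.join(APP, 'databases', '*.table')):
    os.remove(f)

# 3. Run the models once with fake migration enabled, mirroring the
#    manual fix: the model checks for this flag file and passes
#    fake_migrate_all=True to DAL() while it exists.
flag = os.path.join(APP, 'private', 'fake_migrate.flag')
open(flag, 'w').close()
# -S selects the app, -M executes the models, -R runs a script;
# an empty noop.py is enough, since loading the models does the work.
subprocess.check_call(['python', '/opt/web2py/web2py.py',
                       '-S', 'myapp', '-M', '-R', 'scripts/noop.py'])
os.remove(flag)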
Thanks.
