No silver bullets exist (yet). A couple of things I can recommend:

1. GitLab's experience with changing schemas without downtime while
maintaining backward compatibility – their documentation is open, and many
of these problems are already solved and written up
    - start here:
https://docs.gitlab.com/ee/development/migration_style_guide.html
    - check their migration_helpers.rb; there are plenty of useful things
there

2. What my team and I are doing for database migration testing:
https://postgres.ai/. We created Database Lab Engine, an open-source tool
that clones databases of any size in seconds, so you can test anything you
want – manually or automatically in CI – against "full-size" databases. It
can help you catch and block dangerous changes that would cause downtime,
and, if you have a well-maintained set of CI tests, enforce backward
compatibility.
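
For context, the core pattern behind guidelines like GitLab's is often
called "expand/contract": add the new schema alongside the old one, migrate
data in small batches, switch the application over, and only then drop the
old objects. A minimal sketch, assuming a hypothetical users table where a
column fullname is being renamed to full_name (all names here are made up
for illustration):

```sql
-- Step 1 (deploy N): add the new column and keep it in sync with a trigger,
-- so both old and new application versions see consistent data.
-- A short lock_timeout makes the DDL fail fast instead of queueing behind
-- long-running transactions and blocking all other sessions.
SET lock_timeout = '2s';

ALTER TABLE users ADD COLUMN full_name text;

CREATE OR REPLACE FUNCTION sync_full_name() RETURNS trigger AS $$
BEGIN
  NEW.full_name := NEW.fullname;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER users_sync_full_name
  BEFORE INSERT OR UPDATE ON users
  FOR EACH ROW EXECUTE FUNCTION sync_full_name();

-- Step 2: backfill existing rows in small batches to avoid long locks
-- and excessive WAL/bloat (loop over id ranges from the application or a
-- script, not in one giant UPDATE).
UPDATE users SET full_name = fullname
WHERE id >= 1 AND id < 10001 AND full_name IS DISTINCT FROM fullname;

-- Step 3 (deploy N+1): the application reads and writes full_name only.

-- Step 4 (deploy N+2): contract – drop the old column and the trigger.
DROP TRIGGER users_sync_full_name ON users;
DROP FUNCTION sync_full_name();
ALTER TABLE users DROP COLUMN fullname;
```

The key property is that every intermediate state is compatible with both
the currently deployed application version and the next one, so no step
requires downtime.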

Nik

On Sat, May 22, 2021 at 2:12 PM Zahir Lalani <ZahirLalani@oliver.agency>
wrote:

> Hello All
>
> I wonder if I could garner some of the wealth of experience on this group:
>
> Our current application deployments (every 3 weeks) require about 30 min
> of downtime. We are now tasked with making this zero downtime.
>
> From all the reading I have done, we have solutions for the infrastructure
> and code deploy, but with regard to the DB the main issue seems to be
> keeping the new deploy backwards compatible – functions/tables/fields – all
> of it.
>
> That seems like quite a large management task and would require careful
> reviews of changes. Is there any type of framework that already manages
> this type of capability? Or are there aspects of PG that we should be using
> in this regard?
>
> Thx
>
> Z
>
