> It's a multifaceted issue; it's becoming a bottleneck to supporting
> the growing list of committers.

Please list the facets. Is the issue the speed of CI or the speed of
local builds?

> The build has issues: Python 2, Python 3, and C++ are built serially
> in separate containers before being copied to a destination container.
> The integration tests need an hour to build an image even before they
> start running.

Moving to multiple repositories won't help this. Maybe Bazel would.
For CI, a better solution would be to spawn multiple CI jobs: one to
build the py2 artifacts, one for py3, one to generate the broker
tarball, etc., and then pull them together into a single image (see
the sketch below).
For local builds, the issue is that you are using a Mac, which is why
these builds are so slow. The C++, py2, and py3 builds have to run in
a Docker container, which means running on a Docker host inside a VM,
which is slow as hell, especially for C++. Some cross-compilation
would certainly help here.
Another issue is that the py2/py3/C++ build is only needed for the
Python functions runner. It should be possible to separate the
runners out from the main image.
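
As a rough illustration of the fan-out/fan-in idea (not a description
of the project's actual CI setup), here is a minimal Python sketch
that builds the component images in parallel and then assembles the
final image. The Dockerfile paths and image tags are hypothetical:

    # Sketch only: fan out one build per component, then assemble a
    # final image that pulls the artifacts together. Dockerfile paths
    # and tags are hypothetical, not the project's real layout.
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    COMPONENTS = {
        "runner-py2": "docker/Dockerfile.py2",
        "runner-py3": "docker/Dockerfile.py3",
        "cpp-client": "docker/Dockerfile.cpp",
        "broker":     "docker/Dockerfile.broker",
    }

    def build(tag, dockerfile):
        # In real CI each of these would be a separate job that
        # publishes its image to a registry rather than a local build.
        subprocess.run(
            ["docker", "build", "-f", dockerfile, "-t", tag, "."],
            check=True)
        return tag

    with ThreadPoolExecutor() as pool:
        list(pool.map(lambda kv: build(*kv), COMPONENTS.items()))

    # The final Dockerfile would COPY --from=<tag> the pieces it needs.
    subprocess.run(
        ["docker", "build", "-f", "docker/Dockerfile.all",
         "-t", "pulsar-all:latest", "."],
        check=True)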

> Moving the docs to a separate repo gives us some short-term wins and
> is a good start. It will unblock committers who are just concerned
> with doc updates, and we can enable a near-instantaneous update to
> the website for preview.

This could equally be done by making the CI job a no-op if nothing
changes outside of the site2 directory.
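
A minimal sketch of that check, assuming the job compares against
origin/master and that site2/ is the docs root:

    # Sketch only: skip the heavy CI work when a change touches
    # nothing outside of site2/. Assumes origin/master as the base.
    import subprocess
    import sys

    changed = subprocess.run(
        ["git", "diff", "--name-only", "origin/master...HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.split()

    if changed and all(path.startswith("site2/") for path in changed):
        print("Docs-only change; skipping build and tests.")
        sys.exit(0)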

-Ivan
