Hi Simon,

Thanks for your thoughts.

> How do you intend the output of your python script to end up accessible
> for consumption by GitLab pipelines?  A simple approach is to run it
> locally and commit all the generated files to git.  Is that what you had
> in mind?

Yes, that's what I intend to do.

> Another idea is to run the script during the pipeline, and (at least for
> GitLab) use a sub-pipeline that takes the generated pipeline code and runs
> that as a pipeline.  This avoids committing generated files.  It adds a
> bit of complexity and debugging overhead, but can work fine.
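
For reference, I think that's what GitLab calls "dynamic child pipelines".
A minimal sketch (the job names, file names and generator invocation below
are made up, not what we actually have) would be something like:

  # Stage 1: run the generator and keep its output as an artifact
  generate-pipeline:
    stage: build
    script:
      - python generate_ci.py > generated-pipeline.yml
    artifacts:
      paths:
        - generated-pipeline.yml

  # Stage 2: run the generated file as a child pipeline
  run-generated:
    stage: test
    trigger:
      include:
        - artifact: generated-pipeline.yml
          job: generate-pipeline
      strategy: depend   # parent job waits for and reflects the child's status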

Well, the problem is debugging and transparency. When the output of a
generator is consumed immediately, especially in a CI situation where we
can't work interactively, the turnaround cycle
  modify config — rerun pipeline — fetch generated log from pipeline
is more tedious than
  modify config — run generator locally — look at generated file locally

> I'm not that familiar with GitHub Actions, but I assume most of the
> concepts are the same...

There's no usable 'include' feature on the GitHub side, AFAIK.

> Also I think you'll quickly discover that different releases need to
> use different package names

Yep. Package names are not the same, e.g. between CentOS 7 and AlmaLinux 9.
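
Just to illustrate the kind of per-release switching the generator has to
emit (job names and images here are made up; the package split is just one
example I'm fairly sure of):

  # CentOS 7: yum, Python 2 era package names
  build-el7:
    image: centos:7
    before_script:
      - yum install -y python-devel

  # AlmaLinux 9: dnf, python3-* package names
  build-el9:
    image: almalinux:9
    before_script:
      - dnf install -y python3-devel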

Bruno