**************
So you add all these repositories to your jobs and then they are run each time 
one of those repositories is updated, right?

Well, not quite. My unit tests are built every night via a scheduled build in 
the Jenkins pipeline options, and the distribution builds are done manually, 
with a tag number injected by the user.
I do this because I don't have enough PC performance to run a build on every 
commit. But in an ideal world I would put a webhook on my repos to trigger the 
Jenkins build when a push is done to the right branch. The Jenkins common 
library always takes the head of the master branch; this should always work 
and have the most recent functions and bug fixes (I do not break backward 
compatibility, and if I do, I update all the Jenkinsfiles right away). The 
setup and deployment are not large enough over here to bother just yet, but 
you could easily use branches to prevent backward-compatibility issues with 
the common library.
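To make that concrete, the nightly trigger and the user-injected tag both fit 
in a properties() block at the top of the Jenkinsfile. A minimal sketch only 
(AMOTUS_BRANCH matches the parameter used in my example further down; 
TAG_NUMBER and the cron spec are just illustrative names, not my actual setup):

properties([
      pipelineTriggers([cron('H 2 * * *')])  // nightly scheduled build for the unit tests
    , parameters([
          string(name: 'AMOTUS_BRANCH', defaultValue: 'master', description: 'Branch to build')
        , string(name: 'TAG_NUMBER', defaultValue: '', description: 'Tag number for a manual distribution build')
    ])
])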

**************
How do things work on slaves? Is each repo cloned into its own directory in 
the workspace directory?

The master checks out the pipeline repo to fetch the Jenkinsfile from SCM. 
That repo only contains that file (or several of them), along with 
build-specific Groovy/shell scripts that help the build process. The first 
thing the pipeline does on the slave node is check out the common tools and 
import the needed ones inside a subfolder (Amotus_Jenkins/).

Once the tools are loaded, I check out the source and start the build as it 
normally would run. I use environment variables to set the branch and any 
other options used to build that repo; those env vars are injected by the 
Jenkins parameters options.
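For example, a parameter declared on the job surfaces as params.* and can be 
pushed into the environment for shell steps. A hedged sketch, not taken from 
my actual files:

echo "Building branch: ${params.AMOTUS_BRANCH}"
withEnv(["AMOTUS_BRANCH=${params.AMOTUS_BRANCH}"]) {
    sh 'echo "the shell step sees $AMOTUS_BRANCH"'  // same value, visible to scripts
}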

The resulting artifacts are sent to Artifactory/AppCenter/FTP..., and the test 
results are analyzed right inside Jenkins.
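In the Jenkinsfile that boils down to a couple of steps like these (paths are 
hypothetical; the actual upload to Artifactory/AppCenter/FTP goes through the 
respective plugins or scripts, which I leave out here):

archiveArtifacts artifacts: 'dist/**', fingerprint: true  // keep a copy of the build output on Jenkins
junit 'build/test-results/**/*.xml'                       // Jenkins analyzes these test results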

That way, the only things Jenkins knows about are the credentials to the 
repos, the pipeline repo itself, and the parameters asked of the user. The 
rest is done inside the pipeline repo (the unit-test/build/deploy 
Jenkinsfiles). The source repo doesn't even know that any CI exists for it, so 
if I want to build an old version with a recent CI, or change my CI, it still 
works.

The checkout syntax in a Jenkinsfile is a bit painful if you have git 
submodules, but it will track the changes with the right plugins. My first few 
stages mostly look like this; it's a shame that nearly all my Jenkinsfiles 
look alike, but it's straightforward once you've seen it once:

node("PHP && PHPComposer") {
    def amotusModules = [:];  // this dictionary hold the tools modules files I 
made generic for all projects
    def amotusRepos = [
        [
              name: 'Repos Name'
            , url: 'https://bitbucket.org/repos2.git'
            , branch: "${params.AMOTUS_BRANCH}"
            , path: 'SourceRepos2'
        ]
    ];

stage('Checkout Tools') {
        dir("Amotus_Jenkins") {
            checkout([$class: 'GitSCM'
                , branches: [[name: 'master']]
                , browser: [$class: 'BitbucketWeb', repoUrl: 
'https://bitbucket.org/repos.git']
                , doGenerateSubmoduleConfigurations: false
                , extensions: [[$class: 'SubmoduleOption', disableSubmodules: 
false, parentCredentials: true, recursiveSubmodules: true, reference: '', 
trackingSubmodules: false], [$class: 'CleanCheckout']]
                , submoduleCfg: []
                , userRemoteConfigs: [[credentialsId: 'BitBucketAmotus', url: 
'https://bitbucket.org/repo.git']]
            ]);
        }
    }
    
    stage('Load option') {
        dir(pwd() + "/Amotus_Jenkins/"){
            // Load basic file first, then it will load all others options with 
their dependencies
            load('JenkinsBasic.Groovy').InitModules(amotusModules);
            amotusModules['basic'].LoadFiles([
                  'JenkinsPhp.Groovy'
                , 'JenkinsPhpComposer.Groovy'
            ]);
        }
    }
    
    stage('Checkout Repos') {
        amotusRepos.each { repos ->
            dir(repos['path']) {
                checkout([$class: 'GitSCM'
                    , branches: [[name: 
amotusModules['basic'].ValueOrDefault(repos['branch'], 'master')]]
                    , browser: [$class: 'BitbucketWeb', repoUrl: repos['url']]
                    , doGenerateSubmoduleConfigurations: false
                    , extensions: [[$class: 'SubmoduleOption', 
disableSubmodules: false, parentCredentials: true, recursiveSubmodules: true, 
reference: '', trackingSubmodules: false], [$class: 'CleanCheckout']]
                    , submoduleCfg: []
                    , userRemoteConfigs: [[credentialsId: 'BitBucketAmotus', 
url: repos['url']]]
                ]);
            }
        }
    }

   // Perform the build/test stages from here
}
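For illustration, a build/test stage slotted in at that final comment could 
look like this (the stage name, paths, and commands are hypothetical examples 
for a PHP/Composer project, not my actual files):

    stage('Build & Test') {
        dir('SourceRepos2') {
            sh 'composer install --no-interaction'                       // restore dependencies
            sh 'vendor/bin/phpunit --log-junit build/phpunit-results.xml' // run the unit tests
        }
        junit 'SourceRepos2/build/phpunit-results.xml'  // publish the results to Jenkins
    }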

That gives a good idea of how things are executed on the slave node. I also 
use node-level env vars to override the default paths of tools when they are 
not installed in the default location or the OS uses a special path (I'm 
looking at you, macOS).
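In practice that override is just a fallback expression (PHP_TOOL_PATH here is 
a hypothetical name; the env var would be set in the node configuration when 
the tool lives elsewhere):

def phpPath = env.PHP_TOOL_PATH ?: '/usr/bin/php'  // the node env var wins over the default path
sh "${phpPath} --version"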

There is still quite some room for improvement, but I get so little time 
allotted for my DevOps...

-----Original Message-----
From: jenkinsci-users@googlegroups.com <jenkinsci-users@googlegroups.com> On 
Behalf Of Sébastien Hinderer
Sent: August 14, 2020 11:29 AM
To: jenkinsci-users@googlegroups.com
Subject: Re: Pipeline design question

Hello Jérôme, thanks a lot for your response.

Jérôme Godbout (2020/08/11 16:00 +0000):
> Hi,
> this is my point of view only,but using a single script (that you put 
> into your repos make it easier to perform the build, I put my pipeline 
> script into a separated folder). But you need to make sure your script 
> is verbose enough to see where it has fail if anything goes wrong, 
> sinlent and without output long script will be hard to understand 
> where it has an issue with it.

Indeed. Generally speaking, we activate the e and x shell options to have 
commands displayed and scripts stop at the first error.

[...]

> I, for one, use 3+ repos:
> 1- The source code repos
> 2- The pipeline and build script repos (this can evolve aside from the
> source, so my build method can change and be applied to older source
> versions; I use a branch/tag when backward compatibility is broken or a
> specific version is needed for a particular source branch)
> 3- My common Groovy scripts, the tooling shared between my repos
> 4- (optional) my unit tests live aside and can be run on multiple
> versions

That's a very interesting workflow, thanks!

So you add all these repositories to your jobs and then they are run each time 
one of those repositories is updated, right?

How do things work on slaves? Is each repo cloned into its own directory in 
the workspace directory?

> Hope this can help you decide on or plan your build architecture.

It helps a lot! Thanks!

Sébastien.
