Just to add: for now I am using a sort of compromise in the form of an init method that initializes these properties. As long as I call this method only once at the beginning of the pipeline, everything works great. This might be the only way to do it. I would still rather find a way to do it without the init method, or at least make sure the init method never redefines already-defined properties, just to be on the safe side.
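Roughly, the sketch I have in mind for vars/artifactory.groovy looks like this (untested here; the @Field part is just my guess at keeping the values on the script itself rather than in the binding, and the null checks are the guard I mentioned):

    // vars/artifactory.groovy
    import groovy.transform.Field

    @Field def server
    @Field def buildInfo

    def init() {
        // guard: skip anything that is already set, so a second init()
        // call does not overwrite the existing objects
        if (server == null) {
            server = Artifactory.server('artifactory-server-id')
        }
        if (buildInfo == null) {
            buildInfo = Artifactory.newBuildInfo()
        }
    }

    def upload(file, target) {
        // same spec as in my original post, built from file and target
        def uploadSpec = "{....${file},.....${target}......}"
        server.upload(uploadSpec, buildInfo)
    }

    def publishInfo() {
        server.publishBuildInfo(buildInfo)
    }

The pipeline script then just calls artifactory.init() once at the top, and artifactory.upload(...) / artifactory.publishInfo() after that.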
On Thursday, June 2, 2016 at 4:29:37 PM UTC+3, Itai Sanders wrote:
>
> Hi,
>
> I am looking for some help regarding the Pipeline Global Library
> <https://github.com/jenkinsci/workflow-cps-global-lib-plugin/blob/master/README.md>
> functionality.
>
> It seems like there is no way (or at least no straightforward way) to create a
> script that will hold both predefined properties and pipeline methods. I need
> to do this in order to write a wrapper around some Artifactory
> <https://github.com/JFrogDev/project-examples/tree/master/jenkins-pipeline-examples>
> functionality to be used in multiple pipelines. I need the Artifactory API to
> live inside the global library so I can call it with simple calls, yet those
> methods need to interact with a single-per-build Artifactory object instance
> which holds the Artifactory identity and a global build info object that
> aggregates the published artifacts' info.
>
> The naive implementation was something like this:
>
>     server = Artifactory.server('artifactory-server-id')
>     buildInfo = Artifactory.newBuildInfo()
>
>     def upload(file, target) {
>         def uploadSpec = "{....$file,.....$target......}"
>         server.upload(uploadSpec, buildInfo)
>     }
>
>     def publishInfo() {
>         server.publishBuildInfo(buildInfo)
>     }
>
> I managed to implement it fairly easily using the fileLoader plugin, but
> pulling from the external git seems to be problematic for various reasons.
> I thought implementing it as a Global Library would be fairly easy, but it
> seems to be much more pain than I expected.
> The script itself is indeed recognized as the artifactory variable (the
> script is called artifactory.groovy and placed in the vars folder), but when
> I call the artifactory.upload method I get the following error:
>
>     No such property: server for class: groovy.lang.Binding
>
> which suggests that the server property wasn't defined as I expected (and as
> it is in the fileLoader way of running things).
>
> I then tried to create some sort of functionality that would define these
> properties only if they are not yet defined, something like
> this.server = this.server ?: Artifactory.server('artifactory-server-id'),
> or calling binding.variables.containsKey("server") inside an if statement,
> but this kind of logic seems to throw the entire pipeline into an infinite
> recursion.
>
> I will appreciate any idea, workaround, or alternative way to achieve the
> same effect. Calling Artifactory.server('artifactory-server-id') in each
> method is an option, but I can't call Artifactory.newBuildInfo() inside a
> method, since that object needs to be used on all upload calls.
> Right now it seems like the only way is to keep those in the actual pipeline
> script, which defeats much of the point of keeping all Artifactory management
> inside artifactory.groovy.
>
> Itai