I understand how the fingerprinting process works, and I use it to mark our jars, wars, ears, and zips.
However, I was thinking of fingerprinting every file in a zip for every build. This way, we can trace each file back to its build once the archive is unzipped and its contents scattered. That could mean fingerprinting hundreds or thousands of files in each build. I suspect that may add a few minutes to the build, but how will tracking all those fingerprints affect Jenkins performance?

--
David Weintraub
da...@weintraub.name

On Nov 8, 2012, at 3:09 AM, AdvanTiSS <advant...@gmail.com> wrote:

> The fingerprinting process is based on an MD5 checksum calculated with
> java.security.DigestInputStream on each file targeted for fingerprinting.
> You can read some information about MD5 algorithm performance here - [Secure
> hash functions in Java].
>
> On Wednesday, November 7, 2012 3:44:11 PM UTC+2, qazwart wrote:
>>
>> How resource intensive is fingerprinting? What if I fingerprint all the
>> files that I build?
>>
>> We deploy a lot of zipped archives instead of jars and wars in our JBoss
>> instance. This way, we can generate various client configurations. However,
>> it also means that the build assets can get moved around quite a bit, and
>> I'd like some way of determining which build a given file was associated
>> with. Right now, I'm just fingerprinting the zipped archive, but it may be
>> better if I fingerprinted all the files inside the archive before it is
>> zipped.
>>
>> I can't imagine fingerprinting taking up a lot of resources on a per-file
>> basis, but if I am fingerprinting hundreds of files per build, I can
>> imagine it being a problem.
>>
>> What is your policy on fingerprinting files?
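
For a rough sense of the per-file work being discussed, here is a minimal sketch (not Jenkins' actual implementation) of the mechanism described in the reply above: MD5-hashing every entry of a zip through java.security.DigestInputStream. The class name and command-line usage are illustrative only.

import java.io.IOException;
import java.io.InputStream;
import java.security.DigestInputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Enumeration;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;

public class ZipEntryFingerprints {

    // MD5 hex digest of a single stream; reading through the
    // DigestInputStream drives the digest computation.
    static String md5Hex(InputStream in) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (DigestInputStream dis = new DigestInputStream(in, md)) {
            byte[] buffer = new byte[8192];
            while (dis.read(buffer) != -1) {
                // nothing to do per chunk; the digest is updated as we read
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // args[0] is the archive to inspect, e.g. a client-configuration zip
        try (ZipFile zip = new ZipFile(args[0])) {
            Enumeration<? extends ZipEntry> entries = zip.entries();
            while (entries.hasMoreElements()) {
                ZipEntry entry = entries.nextElement();
                if (entry.isDirectory()) {
                    continue;
                }
                String digest = md5Hex(zip.getInputStream(entry));
                System.out.println(digest + "  " + entry.getName());
            }
        }
    }
}

Timing a loop like this over a representative archive should give a reasonable estimate of the hashing cost per build; the separate question of how Jenkins copes with tracking that many fingerprint records remains as asked above.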