Simon Richter (2015-04-25 02:02):
Hi,
I have a project that outputs a few large files (compiled DLL and static
library) as well as a few hundred header files as artifacts for use by
the next project in the dependency chain. Copying these in and out of
workspaces takes quite a long time, yet the network link is nowhere near
capacity, so presumably the handling of many small files is not very
efficient.
Can this be optimized somehow, e.g. by packing and unpacking the files
for transfer? Manual inspection of artifacts is secondary, I think.
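A rough sketch of what I mean by packing, assuming the headers sit in an
"include/" directory next to the DLL and static library (all names here
are just placeholders):

    # producer side: bundle headers and binaries into a single archive
    tar czf artifacts.tar.gz mylib.dll libmylib.a include/

    # consumer side: unpack after the one-file transfer
    tar xzf artifacts.tar.gz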
If some of the files remain unchanged, it can be done more efficiently
when you do NOT pack the files. You could, for example, create a
repository (SVN) for the artifacts; instead of copying all the files you
would simply run `svn update` and fetch only the changed ones. Another
option would be to use rsync for synchronisation, but that might not work
as well as SVN would.
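For illustration, a minimal sketch of the rsync variant (the host name
and paths below are only placeholders):

    # pull only the changed artifact files from the upstream build machine;
    # -a preserves attributes, -z compresses, --delete drops stale files
    rsync -az --delete builder:/var/lib/jenkins/workspace/libproj/artifacts/ ./deps/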
Regards,
Nux.