That is brilliant!  I wish I'd thought of that before, since it will likely 
save me lots of disk space on our slave nodes.
 
Mark Waite


>________________________________
> From: Sami Tikka <sjti...@gmail.com>
>To: "jenkinsci-users@googlegroups.com" <jenkinsci-users@googlegroups.com> 
>Sent: Monday, February 20, 2012 3:51 PM
>Subject: Re: git: reduce clones' disk space
>  
>
>You can already achieve the same benefit by making a local clone of the git 
>repo (use --bare for this) and then configuring each job to have 2 repos: the 
>first should be /path/to/local/repo and the second can be the location where 
>you usually clone from.
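>
>For example, something like this on each slave (the upstream URL is just 
>illustrative):
>
>  git clone --bare git://example.com/project.git /path/to/local/repo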
>
>
>This way most git objects will be shared because a local git clone will use 
>hard links. 
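>
>You can verify the sharing by comparing inode numbers, e.g. ("workspace" here 
>just stands in for a job's workspace directory):
>
>  ls -li /path/to/local/repo/objects/pack workspace/.git/objects/pack
>
>The pack files should show the same inode in both places.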
>
>
>My build slaves at work have small but fast SSD disks and we use this trick 
>(plus running git clean -fxd as a post-task step) to keep disk space usage 
>under control. 
>
>-- Sami
>
>Gergely Nagy <gsz...@gmail.com> wrote on 15.2.2012 at 19.15:
>
>
>>Thanks Mark, 
>>that's great info - to me it sounds like the way to go.
>>Gergo
>>
>>
>>On Wed, Feb 15, 2012 at 3:03 AM, Mark Waite <markwa...@yahoo.com> wrote:
>>
>>>The git plugin rework discussions mentioned the possibility of including the 
>>>"--reference <existing-repository>" argument to git clone so the pack files 
>>>for a single repository could be reused in multiple repositories on the same 
>>>machine.  Then you could clone to a single directory on the slave, and 
>>>reference that clone rather than copying the pack files to each of the 
>>>workspace copies. 
>>> 
>>>I don't think it has been implemented yet, but the plugin developers may be 
>>>willing to share their ideas in case they have an even better idea than 
>>>using the --reference argument to git clone. 
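>>>
>>>By hand, the effect would be roughly the following (the repository URL and 
>>>cache path are made up for illustration):
>>>
>>>  git clone --bare git://example.com/project.git /var/cache/git/project.git
>>>  git clone --reference /var/cache/git/project.git \
>>>      git://example.com/project.git workspace
>>>
>>>The second clone borrows the objects already present in the reference 
>>>repository (via .git/objects/info/alternates) instead of copying them.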
>>> 
>>>Mark Waite
>>>
>>>
>>>> From: Gergely Nagy <gsz...@gmail.com>
>>>>To: jenkinsci-users@googlegroups.com 
>>>>Sent: Tuesday, February 14, 2012 1:23 PM
>>>>Subject: git: reduce clones' disk space
>>>> 
>>>>
>>>>
>>>>Hi Jenkins gurus, 
>>>>
>>>>
>>>>I have a load of jobs (50+ I think) which clone the same repository but 
>>>>different branches, for the build, unit test and functional test stages.
>>>>
>>>>
>>>>Also, it's a special application of the "job splitting pattern" 
>>>>(https://wiki.jenkins-ci.org/display/JENKINS/Splitting+a+big+job+into+smaller+jobs):
>>>> 
>>>>the tarball that downstream jobs receive is much smaller than the entire 
>>>>workspace: it only contains the unknown files (git ls-files -oz, i.e. the 
>>>>build artifacts), which is "just" 400M vs 1.8G. Downstream jobs unpack this 
>>>>on top of a pristine clone to get up to speed. This is quite fast (most 
>>>>files are there already) and also seems to do better change tracking.
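>>>>
>>>>(Concretely, the upstream packaging step is roughly something like this; 
>>>>the archive name is just an example:)
>>>>
>>>>  git ls-files -oz | tar --null -T - -czf build-artifacts.tar.gz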
>>>>
>>>>
>>>>However, it costs space: each workspace is ~4-5G, half of which is the git 
>>>>clone.
>>>>While git has a good reason to clone everything with all the branches, I 
>>>>don't need that duplicated 50 times on the Jenkins box.
>>>>So I am wondering if there is a way to optimise this? 
>>>>I guess I'd rather have one single full clone, and let the jobs have just 
>>>>the work directories (+index?)..
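>>>>
>>>>Just to sketch the kind of thing I have in mind (paths and branch name are 
>>>>invented, and I realise the shared HEAD/index is probably where this gets 
>>>>tricky):
>>>>
>>>>  # one shared full clone per node
>>>>  git clone --no-checkout git://example.com/project.git \
>>>>      /var/jenkins/shared/project
>>>>  # each job only materialises its own work tree and index
>>>>  export GIT_DIR=/var/jenkins/shared/project/.git
>>>>  export GIT_WORK_TREE="$WORKSPACE"
>>>>  export GIT_INDEX_FILE="$WORKSPACE/.git-index"
>>>>  git checkout -f origin/my-branch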
>>>>
>>>>
>>>>Any enlightenment/alternative ideas are appreciated.
>>>>thanks,
>>>>Gergo
>>>>
>>>>
>>>>
>>>>
>>>>
