Hi Uwe,

> On Jan 7, 2016, at 5:20 PM, Uwe Schindler <[email protected]> wrote:
> 
> No, we should delete the deleted Job's workspaces. If you delete a job 
> without first deleting the workspace it is kept on disk (which is a bug in 
> older Jenkins versions as used by ASF). So you have to manually delete it.
> 
>> -----Original Message-----
>> From: Steve Rowe [mailto:[email protected]]
>> Sent: Thursday, January 07, 2016 7:23 PM
>> 
>> Looks like ASF Jenkins's lucene slave is running out of disk space (again):
>> 
>>> [smoker]    [junit4] java.io.IOException: No space left on device
>> 
>> Uwe, opinions on what to do?  Should we ask Infra for more disk space
>> (again)?

I’m guessing you’ve already done the manual deletions?  I don’t see any orphaned 
workspaces on disk, and available space looks adequate:

——
sarowe@lucene1-us-west:~$ df /home/jenkins/jenkins-slave/workspace
Filesystem     1K-blocks     Used Available Use% Mounted on
/dev/sdb1       82437808 56211492  22015680  72% /x1
——

Disk footprints for each job’s workspace:

——
sarowe@lucene1-us-west:~$ sudo -u jenkins du -sk /home/jenkins/jenkins-slave/workspace/*
121436  /home/jenkins/jenkins-slave/workspace/Apache Jackrabbit Oak matrix
973492  /home/jenkins/jenkins-slave/workspace/Lucene-Artifacts-5.3
984084  /home/jenkins/jenkins-slave/workspace/Lucene-Artifacts-5.4
980732  /home/jenkins/jenkins-slave/workspace/Lucene-Artifacts-5.x
963896  /home/jenkins/jenkins-slave/workspace/Lucene-Artifacts-trunk
6527688 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Clover-5.x
6131460 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Clover-trunk
1523640 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-5.3
1641880 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-5.4
1644900 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-5.x
1651268 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Maven-trunk
2654308 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.3
4435392 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.4
2370868 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-5.x
3191700 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-NightlyTests-trunk
2988924 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.3
1917096 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.4
1917780 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-5.x
1919780 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-SmokeRelease-trunk
1017100 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-5.3-Java7
1040176 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-5.4-Java7
1047236 /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-5.x-Java7
988956  /home/jenkins/jenkins-slave/workspace/Lucene-Solr-Tests-trunk-Java8
1618928 /home/jenkins/jenkins-slave/workspace/Solr-Artifacts-5.3
1638788 /home/jenkins/jenkins-slave/workspace/Solr-Artifacts-5.4
1641804 /home/jenkins/jenkins-slave/workspace/Solr-Artifacts-5.x
1649436 /home/jenkins/jenkins-slave/workspace/Solr-Artifacts-trunk
——

Totals by branch:

——
16496496        Trunk
16131008        5.x
11657416        5.4
10776392        5.3
55182748        Total (all workspaces, incl. the Apache Jackrabbit Oak one)
——
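For the record, the per-branch totals above can be recomputed from the du output with a small awk filter. This is just a sketch keyed on the job-name suffixes in the listing (trunk, 5.x, 5.4, 5.3); the `branch_totals` function name is mine, not anything on the box:

```shell
# branch_totals: read "size<TAB>path" lines (du -sk output) on stdin and
# sum sizes per branch, keyed on the job-name suffix. Jobs that match no
# known suffix (e.g. the Oak workspace) land in "other" but still count
# toward Total.
branch_totals() {
  awk '{
    branch = "other"
    if ($0 ~ /trunk$/)     branch = "trunk"
    else if ($0 ~ /5\.x$/) branch = "5.x"
    else if ($0 ~ /5\.4$/) branch = "5.4"
    else if ($0 ~ /5\.3$/) branch = "5.3"
    sum[branch] += $1
    total += $1
  }
  END {
    for (b in sum) printf "%s %d\n", b, sum[b]
    printf "Total %d\n", total
  }'
}

# e.g.:
# sudo -u jenkins du -sk /home/jenkins/jenkins-slave/workspace/* | branch_totals
```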

So it looks like, as long as we keep up with pruning inactive jobs and orphaned 
workspaces, we can safely accommodate two (or three?) active development 
branches and two (or three?) active release branches with the currently 
available disk space.
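In case it helps with the pruning: a quick way to spot orphaned workspaces is to compare the directory names on disk against a list of the jobs the master still knows about. A sketch only; `find_orphans` is a hypothetical helper, and you'd have to get the active-job list from the master yourself (e.g. via its /api/json), since the slave has no local copy:

```shell
# find_orphans JOBS_FILE WS_DIR
#   JOBS_FILE: one active job name per line (fetched from the master)
#   WS_DIR:    the slave's workspace directory
# Prints the name of every workspace directory with no matching job.
find_orphans() {
  jobs_file=$1
  ws_dir=$2
  for d in "$ws_dir"/*/; do
    name=$(basename "$d")
    # -F: literal match, -x: whole line, -q: quiet (exit status only)
    grep -Fxq "$name" "$jobs_file" || echo "$name"
  done
}
```

Anything it prints is a candidate for manual deletion, which per Uwe's note above is what the older Jenkins on ASF requires anyway.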

We’ll have to keep an eye on it when branch_6x and associated Jenkins jobs are 
created.

Steve


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
