On Tue, 5 Dec 2006, Joe Advisor wrote:

> Hi all,
> 
> If I rapidly rewrite a file, for example:
> 
>  while true; do echo "foo" > /foo; done;
> 
> Or for example:
> 
> #!/usr/bin/perl
> for (1 .. 100000) {
>    MyStuff::Util::writeFile('/root/foo', $blah);
> }
> 
> The filesystem eventually reports "filesystem full".
> 
> Obviously those are corner cases because I am rewriting rapidly.  But even if 
> I don't rewrite rapidly, and instead rewrite only once every few seconds 
> based on changes to the environment, etc., the filesystem still fills up.  
> Making the filesystems bigger helps, but I was wondering if there is 
> another way.
> 
> Putting a sync in cron helps a lot too, but that seemed like a kludge and I 
> wasn't sure if it's the right thing to do.
> 
> Is there a global setting, perhaps some sysctl, that I need to modify, to 
> prevent this from happening?
> 
> Thanks in advance.

With this loop:

        while true; do echo "foo" > foo; sleep 1; done; 

I see increasing usage, but it drops after a while. There's no
permanent increase.
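
If you want to watch it yourself, something like this works (just a
sketch; the file name and the intervals are arbitrary):

        # run the rewrite loop in the background
        while true; do echo "foo" > foo; sleep 1; done &
        # watch free space on the filesystem holding foo
        while true; do df -k .; sleep 5; done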

It could be that softdep processing (are you using softdep?) is not able
to keep up. You could try without softdeps.
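
To see whether a filesystem is mounted with softdep, look at the mount
output; the option shows up next to the mount point (the device and
mount point below are just examples):

        $ mount
        /dev/wd0a on / type ffs (local, softdep)

The matching /etc/fstab line would look something like this; dropping
the softdep keyword and remounting (or rebooting) turns it off:

        /dev/wd0a / ffs rw,softdep 1 1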

        -Otto
