Is anyone very much in love with the --memlimit (default: 3300MB)
option to the sage -t command?

Once again it has completely broken testing on some systems. We could
try to guess a new, higher limit... or just admit that maybe it's not a
great idea after all and delete the thing. The latter is what I'm about
to propose on https://trac.sagemath.org/ticket/31395

A few points:

  * The default limit failing has been, and will continue to be, a
    recurring problem as sage requires more and more memory. Every
    time we hit it, a bunch of machines are broken until the limit
    can be raised in the "develop" branch.

  * The default memory limit exists for precisely one doctest (which 
    I've refactored to still be tested, modulo the next bullet point).

  * Testing out-of-memory conditions doesn't test what you'd expect, 
    since if you _actually_ run out of memory, all hell breaks loose
    on the system at the same time as your graceful error handling 
    kicks in.

  * A global limit is likely incorrect, as would be revealed if there
    were more than one doctest using it. Each test needs the limit to
    be low enough to trigger a failure, but not so low that it crashes
    the rest of sage. Both of those thresholds are test- and system-
    dependent.

  * Reimplementing "ulimit -v" in a mathematics suite is a waste of
    development resources.

I'd rather just delete it and generalize the one existing doctest with
something like a "with memlimit(...)" context manager in the unlikely
event that we ever have another test for OOM behavior.
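
For concreteness, here is a rough sketch of what such a context manager
could look like, using the resource module to cap the address space
(the same mechanism as "ulimit -v") only for the duration of the block.
The name memlimit comes from above; everything else, including the
some_memory_hungry_call() placeholder, is illustrative rather than an
existing Sage API:

    import resource
    from contextlib import contextmanager

    @contextmanager
    def memlimit(nbytes):
        # Illustrative only: lower the soft address-space limit for the
        # duration of the block, then restore the previous value. The
        # hard limit is left alone so the soft limit can be raised back.
        soft, hard = resource.getrlimit(resource.RLIMIT_AS)
        resource.setrlimit(resource.RLIMIT_AS, (nbytes, hard))
        try:
            yield
        finally:
            resource.setrlimit(resource.RLIMIT_AS, (soft, hard))

A test could then pick a limit that makes sense for itself:

    with memlimit(500 * 1024**2):     # 500MB, chosen per-test
        some_memory_hungry_call()     # expected to raise MemoryError

so both thresholds from the "global limit" bullet above become per-test
decisions instead of a single guess baked into the doctest runner.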

