Setting ulimits is generally a good idea: excessive memory use is usually a sign of a bug, and it is better for the program to crash before it forces the system to thrash its swap space (especially on multi-user machines).
One can set "soft" ulimits, which are generally the right thing: users can raise them if they really need to, but are otherwise protected. Setting the virtual-memory ulimit too low will indeed cause Sage to fail on startup. Apparently 10^5 KiB (about 100 MB) is too small for Sage, which is fine:

    sh-3.2$ ulimit -S -v 100000
    sh-3.2$ ulimit -v
    100000
    sh-3.2$ sage
    ----------------------------------------------------------------------
    | Sage Version 4.4.4, Release Date: 2010-06-23                       |
    | Type notebook() for the GUI, and license() for information.        |
    ----------------------------------------------------------------------
    [...]
    ImportError: /usr/local/sage/4.4.4/local/lib/python2.6/lib-dynload/fcntl.so:
    failed to map segment from shared object: Cannot allocate memory

However, if we set a more reasonable bound (10^6 KiB, roughly a gigabyte), Sage does start up without problems — but, as you can see, it wipes the soft virtual-memory limit:

    sh-3.2$ ulimit -S -v 1000000
    sh-3.2$ ulimit -v
    1000000
    sh-3.2$ sage
    ----------------------------------------------------------------------
    | Sage Version 4.4.4, Release Date: 2010-06-23                       |
    | Type notebook() for the GUI, and license() for information.        |
    ----------------------------------------------------------------------
    sage: import os
    sage: os.system("ulimit -v")
    unlimited
    0

(The trailing 0 is just the exit status returned by os.system.)

Is this intentional? Does anyone know how to turn it off? Our students, and the people who share computers with them, benefit greatly from runaway programs crashing quickly instead of bothering everyone else.

-- 
To post to this group, send an email to sage-devel@googlegroups.com
To unsubscribe from this group, send an email to sage-devel+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/sage-devel
URL: http://www.sagemath.org
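Since Sage is Python underneath, whatever resets the limit presumably goes through Python's `resource` module (this is an assumption, not something I have verified in the Sage sources). For reference, here is a minimal sketch of how a process manipulates its own soft limit without touching the hard limit — which is exactly why a soft limit alone cannot protect against a program that chooses to raise it:

```python
import resource

# Query the current address-space (virtual memory) limits of this process.
soft, hard = resource.getrlimit(resource.RLIMIT_AS)

# Lower only the soft limit to 1 GiB; the hard limit is left untouched,
# so this process (or any child) may raise the soft limit again later.
# Assumes the current hard limit is at least 1 GiB (or unlimited).
one_gib = 1024 ** 3
resource.setrlimit(resource.RLIMIT_AS, (one_gib, hard))

new_soft, new_hard = resource.getrlimit(resource.RLIMIT_AS)
print(new_soft)           # 1073741824
print(new_hard == hard)   # True
```

A startup script can undo such a soft limit just as easily by calling `setrlimit` with a larger value (up to the hard limit), which would produce exactly the `unlimited` output seen above.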
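Until someone finds the switch, one possible workaround is to impose a *hard* limit, which an unprivileged process cannot raise again — so even if Sage resets the soft limit, it stays capped. This is only a sketch and assumes Sage itself is happy under the chosen bound (10^6 KiB worked above):

```shell
# Inside a throw-away subshell: set a limit, then try to raise it.
sh -c '
  ulimit -v 1000000                 # with neither -S nor -H, BOTH limits are set
  ulimit -S -v 2000000 2>/dev/null \
    || echo "cannot raise the soft limit past the hard limit"
  ulimit -v                         # still capped at 1000000 KiB
'
```

The catch is that lowering the hard limit is irreversible for that shell and all its children, so it is best done in a wrapper subshell as above rather than in the login shell itself.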