Hello,

On 19 February 2013 13:31, Phil Holmes <m...@philholmes.net> wrote:

> ----- Original Message -----
> From: James
> To: Phil Holmes
> Cc: bug-lilypond@gnu.org
> Sent: Tuesday, February 19, 2013 12:42 PM
> Subject: Re: Error when compiling a large file
>
>> Hello,
>>
>> Yes, there is a limit per process - the software company I work for
>> hits that all the time, so we have to spawn multiple instances of a
>> particular process to get the RAM we need for a given driver.
>> Slightly different to what LP does, but there are limits.
>>
>> James
>
> I've just written a C# program that uses Marshal.AllocHGlobal to
> allocate global memory repeatedly. I ran it until it had grabbed more
> than 5 Gigs, so it's not a fundamental limit of the architecture.
>
> The other programs were running rather more slowly with this one
> going....

http://msdn.microsoft.com/en-us/library/aa366778(VS.85).aspx

There are limits, but with x64 I cannot believe we are hitting them in
this case. It seems that in my case and the user's, we are running out of
available 'free' memory (whatever that technically means in Windows'
world).

I am no coder, but I seem to recall that there is a difference between
system memory and RAM, and that Windows will allow allocation of more
than it physically has in various cases, but this seems to be something
more fundamental.

James

_______________________________________________
bug-lilypond mailing list
bug-lilypond@gnu.org
https://lists.gnu.org/mailman/listinfo/bug-lilypond