> From: Tomas Kalibera <tomas.kalib...@gmail.com> 
> > Can I use the 64-bit gcc to build a 32-bit package with the -m32 
> > command line option with Rtools?  And, can that work for CRAN?  Or 
> > more generally, is there a work-around for needing lots of RAM during 
> > compilation with the 32-bit compiler?
> >
> > The background is:
> >
> > I'm trying to compile the development version of RxODE 
> > (https://github.com/nlmixrdevelopment/RxODE/issues/278), but I'm 
> > hitting 32-bit memory limits (using >3GB and possibly >4GB RAM during 
> > compilation) using the 32-bit version of gcc.
>
> I think this is too much memory to be used for compilation. I think it
> would be best to simplify the code, possibly split it, or just reduce
> the optimization level, as I read you have done already anyway. Maybe
> it doesn't have to be -O0, maybe you can enable some. In the past I've
> seen similar cases when inlining too aggressively in large files, maybe
> you could just reduce that a bit. It may very well be that reducing the
> optimization level just a little bit will provide about the same
> performance, but require far less memory at compile time (in the past
> there have been cases when -O3 did not produce faster code than -O2 on
> a set of standard benchmarks, of course that may be different in
> today's compilers).

Yes, we are testing reducing the optimization level.  Compiling with -O0 
works and uses only ~500 MB.  The code can't be simplified, as it is a 
very long algebraic equation.  The function is typically called in the 
middle of an optimization routine that can take minutes to hours 
(statistical minimization here, not compiler optimization), so an 
optimized build is preferable.  But it's a fair point that I don't know 
how much the different -O levels actually help for what is only long 
algebra.
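For local testing, the flags can be varied per user via ~/.R/Makevars 
(~/.R/Makevars.win on Windows) without touching the package sources; a 
minimal sketch of what such a file could look like (the particular 
levels and the -finline-limit value below are only examples to compare 
against the defaults):

    # ~/.R/Makevars -- per-user compiler flags used when installing packages from source
    CFLAGS   = -O1 -Wall
    CXXFLAGS = -O1 -Wall
    # or keep -O2 but restrain inlining in the large file, e.g.
    # CFLAGS = -O2 -finline-limit=100

That lets us compare compile-time memory use and run-time speed at the 
different levels before deciding what to do in the package itself.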

Thanks,

Bill

