David Edelsohn <dje....@gmail.com>:
> > The truth is we're near the bleeding edge of what conventional tools
> > and hardware can handle gracefully.  Most jobs with working sets as
> > big as this one's do only comparatively dumb operations that can be
> > parallelized and thrown on a GPU or supercomputer.  Most jobs with
> > the algorithmic complexity of repository surgery have *much* smaller
> > working sets.  The combination of both extrema is hard.
> 
> If you come to the conclusion that the GCC Community could help with
> resources, such as the GNU Compile Farm or paying for more RAM, let us
> know.

128GB of DDR4 registered RAM would allow me to run conversions with my
browser up, but it would be eye-wateringly expensive.  Thanks, but I'm
not going to yell for that help unless the working set gets so large
that it blows out 64GB even with nothing but i3 and some xterms running.

Unfortunately, that contingency no longer seems impossible.

(If you're not familiar, i3 is a minimalist tiling window manager with
a really small working set.  I like it and would use it even if I
didn't have a memory-crowding problem.  Since I do, it is extra helpful.)
-- 
		<a href="http://www.catb.org/~esr/">Eric S. Raymond</a>

My work is funded by the Internet Civil Engineering Institute: https://icei.org
Please visit their site and donate: the civilization you save might be your own.
