On Mon, 2018-07-09 at 10:57 -0600, Jeff Law wrote:
> On 07/09/2018 10:53 AM, Janus Weil wrote:
> > 2018-07-09 18:35 GMT+02:00 Eric S. Raymond <e...@thyrsus.com>:
> > > David Edelsohn <dje....@gmail.com>:
> > > > > The truth is we're near the bleeding edge of what conventional tools
> > > > > and hardware can handle gracefully.  Most jobs with working sets as
> > > > > big as this one's do only comparatively dumb operations that can be
> > > > > parallelized and thrown on a GPU or supercomputer.  Most jobs with
> > > > > the algorithmic complexity of repository surgery have *much* smaller
> > > > > working sets.  The combination of both extrema is hard.
> > > > 
> > > > If you come to the conclusion that the GCC Community could help with
> > > > resources, such as the GNU Compile Farm or paying for more RAM, let us
> > > > know.
> > > 
> > > 128GB of DDR4 registered RAM would allow me to run conversions with my
> > > browser up, but be eye-wateringly expensive.  Thanks, but I'm not
> > > going to yell for that help.
> > 
> > I for one would certainly be happy to donate some spare bucks towards
> > beastie RAM if it helps to get the GCC repo converted to git in a
> > timely manner, and I'm sure there are other GCC
> > developers/users/sympathizers who'd be willing to join in. So, where
> > do we throw those bucks?
> 
> I'd be willing to throw some $$$ at this as well.

I may be reading too much between the lines, but I suspect Eric is more
hoping to get everyone focused on pushing this through before the GCC
commit count grows even further out of control than he is asking for a
hardware handout :).

Maybe the question should rather be: what does the dev community need
to do to help push this conversion through as soon as possible?
