Paolo Bonzini wrote:
> Kenneth Zadeck wrote:
>> I think that it is time that we in the GCC community took some time
>> to address the problem of compiling very large functions in a
>> somewhat systematic manner.
> While I agree with you, I think there are so many things we are
> already trying to address that this one can wait.
It certainly can, but I see no reason why it should. This is a class of
issues that users run into, and if someone is motivated to work on it,
then that's great!
I like Kenny's idea of having a uniform set of metrics for size (e.g.,
number of basic blocks, number of variables) and a limited set of
gating functions, because that will allow us to explain what's going on
to users, and allow users to tune the behavior. For example, if the
metric for disabling a pass (by default) is "# basic blocks > 10", then
we can have a -foptimize-bigger=2 switch to change that to "20". If the
gating condition were instead some arbitrary computation, it would be
harder to implement, and harder to explain.
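
To make that concrete, here is a minimal sketch of the kind of gate I
have in mind; this is not actual GCC code, the names size_gate and
optimize_bigger are hypothetical, and the basic-block count stands in
for whatever metric a given pass would use:

  #include <stdbool.h>

  /* Scale factor that a hypothetical -foptimize-bigger=N switch would
     set; defaults to 1.  */
  static int optimize_bigger = 1;

  /* Return true if a pass whose default size limit is BASE_THRESHOLD
     should run on a function with N_BLOCKS basic blocks.  The
     user-visible switch simply scales the per-pass limit.  */
  static bool
  size_gate (int n_blocks, int base_threshold)
  {
    return n_blocks <= base_threshold * optimize_bigger;
  }

  /* E.g., a pass gated at 10 blocks by default would run on a
     20-block function only when -foptimize-bigger=2 raises the
     limit.  */

The point is that a single user-visible multiplier applied to simple
per-pass thresholds is both trivial to implement and easy to document.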
Certainly, setting the default thresholds reasonably will be
non-trivial. If we can agree on the basic mechanism, though, we could
add thresholding on a pass-by-pass basis.
--
Mark Mitchell
CodeSourcery
[EMAIL PROTECTED]
(650) 331-3385 x713