On 11/4/06, Kenneth Zadeck <[EMAIL PROTECTED]> wrote:
I think that it is time that we in the GCC community took some time to
address the problem of compiling very large functions in a somewhat
systematic manner.
GCC has two competing interests here: it needs to provide state-of-the-art
optimization for modest-sized functions, and it needs to be able to properly
process very large, machine-generated functions using reasonable resources.
I believe that the default behavior for the compiler should be that
certain non-essential passes be skipped if a very large function is
encountered.
There are two problems here:
1) defining the set of optimizations that need to be skipped.
2) defining the set of functions that trigger the special processing.
For (2) I would propose that three measures be taken of each function,
before inlining occurs: the number of variables, the number of statements,
and the number of basic blocks.
Why before inlining? These three numbers can change quite significantly
as a function passes through the pass pipeline. So we should try to keep
them up to date to have an accurate measurement.
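As a rough illustration of the shape this could take (not actual GCC code;
the struct, helper names, and threshold values below are all hypothetical),
the measures could live in a small record that is filled in before inlining,
kept current as passes transform the function, and consulted to decide
whether the non-essential passes should be skipped:

/* A rough sketch only: the struct, helpers, and thresholds below are
   hypothetical illustrations of the proposal, not existing GCC code.  */

#include <stdbool.h>
#include <stdio.h>

/* The three per-function measures proposed above, taken before
   inlining and then kept current as passes transform the function.  */
struct function_size
{
  unsigned num_vars;   /* number of variables */
  unsigned num_stmts;  /* number of statements */
  unsigned num_bbs;    /* number of basic blocks */
};

/* Hypothetical limits; real ones would be fixed, documented defaults,
   overridable per invocation (e.g. via a --param-style option).  */
#define LARGE_FUNCTION_VARS  10000
#define LARGE_FUNCTION_STMTS 50000
#define LARGE_FUNCTION_BBS   20000

/* True if the function counts as "very large", in which case the
   non-essential passes would be skipped for it.  */
static bool
skip_nonessential_passes_p (const struct function_size *fs)
{
  return fs->num_vars > LARGE_FUNCTION_VARS
         || fs->num_stmts > LARGE_FUNCTION_STMTS
         || fs->num_bbs > LARGE_FUNCTION_BBS;
}

/* Keep the measures current as the pipeline adds statements, rather
   than relying only on a single pre-inlining snapshot.  */
static void
note_stmts_added (struct function_size *fs, unsigned n)
{
  fs->num_stmts += n;
}

int
main (void)
{
  struct function_size small = { 12, 80, 9 };            /* modest function */
  struct function_size big = { 40000, 900000, 120000 };  /* machine generated */

  note_stmts_added (&small, 25);   /* e.g. inlining grew the body a little */

  printf ("small: skip non-essential passes? %s\n",
          skip_nonessential_passes_p (&small) ? "yes" : "no");
  printf ("big:   skip non-essential passes? %s\n",
          skip_nonessential_passes_p (&big) ? "yes" : "no");
  return 0;
}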
Otherwise the proposal sounds reasonable, but we should make sure the
limits we impose allow reproducible compilations for N x M cross
configurations and for native compilation on differently sized machines.
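To illustrate that constraint (again with purely hypothetical names and
numbers, not GCC code): the limits have to be fixed parameters of the
compilation, never values derived from the host, or an i686-hosted cross
compiler and a large native build machine would make different decisions
for the same input:

/* Hypothetical illustration only (names and numbers invented): the
   limit is a fixed input to the compilation, so cross and native
   builds reach the same decision for the same source.  */

#include <stdbool.h>
#include <stdio.h>

/* Good: a fixed, documented default the user can override explicitly.  */
static unsigned large_function_stmts = 50000;

/* Bad (deliberately left as a comment): deriving the limit from host
   resources, e.g. large_function_stmts = host_memory_kb () / 100,
   would make the generated code depend on the build machine.  */

static bool
function_is_large_p (unsigned num_stmts)
{
  return num_stmts > large_function_stmts;
}

int
main (void)
{
  printf ("900000 statements large? %s\n",
          function_is_large_p (900000) ? "yes" : "no");
  return 0;
}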
Richard.