On Thu, 2019-12-19 at 17:06 -0600, Qing Zhao wrote:
> Hi, Dmitry,
> 
> Thanks for the response. 
> 
> Yes, routine size alone cannot determine the complexity of a routine. 
> Different compiler analyses might use different formulas, with multiple 
> parameters, to compute complexity. 
> 
> However, the common issue is the same: when the complexity of a specific 
> routine, for a specific compiler analysis, exceeds a threshold, the 
> compiler might consume all available memory and abort the compilation. 
> 
> Therefore, to avoid failed compilations due to running out of memory, 
> some compilers set a threshold on the complexity of a specific compiler 
> analysis (for example, a more aggressive data-flow analysis). When the 
> threshold is exceeded, that aggressive analysis is turned off for the 
> routine, or the optimization level is lowered for that routine (with a 
> warning issued at compile time about the adjustment).  
> 
> I am wondering whether GCC has such a capability, or any option to 
> increase or decrease the threshold for some of the common analyses (for 
> example, data flow)?
> 
There are various places where, if we hit a limit, we throttle
optimization.  But it's not done consistently or pervasively.

Those limits are typically around things like CFG complexity.

We do _not_ try to recover after an out-of-memory error, or anything
like that.

jeff
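
For reference, GCC exposes many of these per-pass limits through its
--param mechanism. A minimal sketch, assuming a reasonably recent GCC
(the set of parameters, their units, and their defaults vary by
release):

    # List the tunable thresholds this GCC knows about:
    gcc --help=params

    # Example: adjust the memory cap on global CSE's dataflow work.
    # If GCSE would need more memory than this, the pass is skipped
    # for the function instead of exhausting memory (units and the
    # default value differ across releases; check --help=params).
    gcc -O2 --param max-gcse-memory=131072 -c foo.c

These parameters throttle individual passes up front; as noted above,
GCC does not attempt to recover once memory is actually exhausted.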