> >>>>> Wilco Dijkstra wrote:
> >
> > It relies on static branch probabilities, which are set completely wrong in
> > GCC, so it ends up optimizing the hot path in many functions for size
> > rather than speed, and vice versa.
>
> This sounds like the static branch prediction issue that we have been
> discussing with Honza. Honza discussed plans to improve this
> infrastructure during this development cycle.
In fact I just started refactoring the profile updating code. However, I do
not see how it is going to help here, or what is wrong. The testcase:

    void g(void);
    int a;

    int f(void)
    {
        g();
        if (a < 0x7ffffffe)  /* or != 0, or < 0 */
            return -1;
        a = 1;
        return 1;
    }

triggers the negative return heuristic, which says that with 96%
probability the function will not return a negative constant (as those are
usually error states). This further combines with the default compare
heuristic, but that will not outvote this one and make "return -1" more
likely than "return 1;".

With the opcode heuristic, we predict that comparisons a == b are false
when b is not 0. We also predict that values are positive; this is what
triggers for the test a > 0.

What is the particular problem with this?

Honza

> - David