On 8/10/24 13:24, Gerald Pfeifer wrote:

+With @code{-fprofile-use} all portions of programs not executed during
+training runs are optimized aggressively for size rather than speed.
+In some cases it is not practical to train all possible hot paths in
+the program. (For example, a program may contain functions specific to
+a given hardware and training may not cover all hardware configurations
+the program later runs on.)  With @code{-fprofile-partial-training}
+profile feedback will be ignored for all functions not executed during
+the training, them being optimized as if they were compiled without
+profile feedback. This leads to better performance when the training
+is not representative at the cost of significantly bigger code.

Hmmm, this is still pretty confusing; I had to read it 3 or 4 times before I realized that the first sentence was describing behavior *without* this option rather than what the option does. :-S  I suggest putting the most important information first, maybe like:

This option modifies the behavior of @option{-fprofile-use} on functions that were not executed during training runs. Normally @option{-fprofile-use} causes such functions to be optimized aggressively for size; @option{-fprofile-partial-training} instead causes them to be optimized as if they were compiled without profile feedback.

This option is useful when it is not practical to train all possible hot paths in the program. (For example, a program may contain functions specific to a given hardware configuration, and training may not cover all configurations the program later runs on.) In cases where the training runs are not representative, this option improves performance at the cost of significantly larger code.

-Sandra
