https://gcc.gnu.org/bugzilla/show_bug.cgi?id=67918
--- Comment #6 from Nameless <11throwaway11 at outlook dot com> --- Another update: in Crypto++ there is a class with a virtual function, defined in a header file: https://github.com/weidai11/cryptopp/blob/b7de164d6251dc066123b59bc15d30c74e920756/modes.h#L71

Two derived classes override this function but simply call the base-class version:
1) https://github.com/weidai11/cryptopp/blob/b7de164d6251dc066123b59bc15d30c74e920756/modes.h#L173
2) https://github.com/weidai11/cryptopp/blob/b7de164d6251dc066123b59bc15d30c74e920756/modes.h#L232

Moving the definitions of these functions from the header file into a .cpp file and recompiling the library made the testcase work: https://github.com/weidai11/cryptopp/commit/8134f2cd502e457201d65f1dc557268ba13e3663

Worth noting that -fdevirtualize alone is not enough; it takes the combination of -O1 and -fdevirtualize. -O0 -fdevirtualize doesn't crash. I couldn't narrow it down to a specific -O1 optimization, because applying all the individual flags listed under -O1 at https://gcc.gnu.org/onlinedocs/gcc/Optimize-Options.html did not make the output binary crash. I guess it's a case of "Not all optimizations are controlled directly by a flag. Only optimizations that have a flag are listed in this section. Most optimizations are only enabled if an -O level is set on the command line".
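For reference, the shape of the code involved is roughly the following. This is a minimal sketch with hypothetical names, not the actual Crypto++ classes: a virtual function whose body lives in the header, plus derived classes whose overrides (also defined inline in the header) just forward to the base-class version — the kind of call -fdevirtualize is allowed to resolve statically.

```cpp
#include <cassert>

// --- would live in a header file ---

struct Base {
    virtual ~Base() {}
    // Virtual function with its body in the header (hypothetical name).
    virtual int Process(int x) { return x + 1; }
};

// Two derived classes override the virtual but only forward to the
// base-class implementation, with the bodies defined inline in the header.
struct DerivedA : Base {
    int Process(int x) override { return Base::Process(x); }
};

struct DerivedB : Base {
    int Process(int x) override { return Base::Process(x); }
};

// A virtual call through a base reference: a devirtualization candidate
// once the compiler can prove the dynamic type.
int run(Base &b, int x) { return b.Process(x); }
```

The fix in the commit above corresponds to moving the `DerivedA::Process` / `DerivedB::Process` bodies out of the header into a single .cpp file, so every translation unit sees the same out-of-line definitions.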