This moves the exception path out-of-line within the function, rather than
into the .fixup section, which improves backtraces.

Because the macro is used multiple times, the fault label needs to be
declared local to each expansion.
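
As a minimal sketch of why (hypothetical macro, not this code): __label__
confines the label to one expansion of the statement expression, so two uses
in one function don't collide:

    #define CHECK(cond)                                 \
        ({                                              \
            __label__ fault;  /* local to this expansion */ \
            int ok_ = 1;                                \
            if ( !(cond) )                              \
                goto fault;                             \
            if ( 0 )                                    \
            {                                           \
            fault:                                      \
                ok_ = 0;                                \
            }                                           \
            ok_;                                        \
        })

    static int demo(int a, int b)
    {
        /* Without __label__, the second expansion would redefine
         * 'fault' and the build would fail. */
        return CHECK(a) && CHECK(b);
    }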

No functional change.

Signed-off-by: Andrew Cooper <andrew.coop...@citrix.com>
---
CC: Jan Beulich <jbeul...@suse.com>
CC: Roger Pau Monné <roger....@citrix.com>

Slightly RFC.  I haven't checked if Eclair will be happy with __label__ yet.

It is disappointing that, unless we retain the xor/mov for the exception path,
GCC decides to emit worse code, notably duplicating the mov %ds success path
in mov %es's error path.
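
For reference, the rejected shape looked roughly like this (a sketch
reconstructed from the above, macro continuations omitted), with the fixup
written in C and only the segment load kept in asm:

    if ( 0 )
    {
    fault: __attribute__((cold));
        all_segs_okay = false;        /* GCC picks the codegen */
        asm_inline volatile (
            "mov %k[nul], %%" #seg
            :: [nul] "r" (0) );
    }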

The "+r" constraint was actually wrong before; the asm only produces
all_segs_okay and does not consume it.  Given leeway, GCC decides to manifest
$1 in a different register on each error path and OR them together (inverted,
I'm guessing) to reconstitute all_segs_okay.
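
In miniature, the constraint difference (hypothetical operands, not this
code):

    bool ok = true;

    /* "+r": read-write.  GCC must have the live value of ok in the
     * register on entry, even if the asm never reads it. */
    asm volatile ( "xor %k[ok], %k[ok]" : [ok] "+r" (ok) );

    /* "=r": write-only.  GCC assumes nothing on entry, and is free to
     * pick a different register at each site. */
    asm volatile ( "xor %k[ok], %k[ok]" : [ok] "=r" (ok) );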

Still, we've got rid of the manual jmp...
---
 xen/arch/x86/domain.c | 27 ++++++++++++++++-----------
 1 file changed, 16 insertions(+), 11 deletions(-)

diff --git a/xen/arch/x86/domain.c b/xen/arch/x86/domain.c
index 56c381618712..d795e5b968e2 100644
--- a/xen/arch/x86/domain.c
+++ b/xen/arch/x86/domain.c
@@ -1738,17 +1738,22 @@ static void load_segments(struct vcpu *n)
      * @all_segs_okay in function scope, and load NUL into @sel.
      */
 #define TRY_LOAD_SEG(seg, val)                          \
-    asm_inline volatile (                               \
-        "1: mov %k[_val], %%" #seg "\n\t"               \
-        "2:\n\t"                                        \
-        ".section .fixup, \"ax\"\n\t"                   \
-        "3: xor %k[ok], %k[ok]\n\t"                     \
-        "   mov %k[ok], %%" #seg "\n\t"                 \
-        "   jmp 2b\n\t"                                 \
-        ".previous\n\t"                                 \
-        _ASM_EXTABLE(1b, 3b)                            \
-        : [ok] "+r" (all_segs_okay)                     \
-        : [_val] "rm" (val) )
+    ({                                                  \
+        __label__ fault;                                \
+        asm_inline volatile goto (                      \
+            "1: mov %k[_val], %%" #seg "\n\t"           \
+            _ASM_EXTABLE(1b, %l[fault])                 \
+            :: [_val] "rm" (val)                        \
+            :: fault );                                 \
+        if ( 0 )                                        \
+        {                                               \
+        fault: __attribute__((cold));                   \
+            asm_inline volatile (                       \
+                "xor %k[ok], %k[ok]\n\t"                \
+                "mov %k[ok], %%" #seg                   \
+                : [ok] "=r" (all_segs_okay) );          \
+        }                                               \
+    })
 
     if ( !compat )
     {
-- 
2.39.5

