On 01/10/2018 02:51 PM, David Woodhouse wrote:

> + */
> +#define __FILL_RETURN_BUFFER(reg, sp, uniq)  \
> +     mov     $(NUM_BRANCHES_TO_FILL/2), reg; \
> +     .align  16;                             \
> +.Ldo_call1_ ## uniq:                         \
> +     call    .Ldo_call2_ ## uniq;            \
> +.Ltrap1_ ## uniq:                            \
> +     pause;                                  \
> +     jmp     .Ltrap1_ ## uniq;               \
> +     .align  16;                             \
> +.Ldo_call2_ ## uniq:                         \
> +     call    .Ldo_loop_ ## uniq;             \
> +.Ltrap2_ ## uniq:                            \
> +     pause;                                  \
> +     jmp     .Ltrap2_ ## uniq;               \
> +     .align  16;                             \
> +.Ldo_loop_ ## uniq:                          \
> +     dec     reg;                            \
> +     jnz     .Ldo_call1_ ## uniq;            \
> +     add     $(BITS_PER_LONG/8)*NUM_BRANCHES_TO_FILL, sp;
> +
>  #ifdef __ASSEMBLY__
>  

> +
> +     asm volatile (ALTERNATIVE("",
> +                               __stringify(__FILL_RETURN_BUFFER(%0, %1, _%=)),
> +                               X86_FEATURE_RETPOLINE)

We'll be patching in a fairly long set of instructions here.  Maybe put
the ALTERNATIVE in the assembly macro itself and use a jmp skip_\@ as the
original instruction, so the feature-off case is a single short jump
instead of NOPing out the whole sequence.
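
Something along these lines (untested sketch; the .macro wrapper name and
the way the unique label suffix gets passed are only illustrative):

	.macro FILL_RETURN_BUFFER reg:req sp:req ftr:req
		/* Feature off: one short jmp over the whole sequence. */
		ALTERNATIVE "jmp .Lskip_rsb_\@",			  \
			__stringify(__FILL_RETURN_BUFFER(\reg, \sp, \@)), \
			\ftr
	.Lskip_rsb_\@:
	.endm

(How the uniq argument interacts with the cpp token pasting in
__FILL_RETURN_BUFFER would probably need rework for the .macro context.)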

Thanks.

Tim
