On Monday, October 2, 2017 11:12:42 AM CEST Jiri Slaby wrote:
> These are all functions which are invoked from elsewhere, so we annotate
> them as global using the new SYM_FUNC_START. And their ENDPROC's by
> SYM_FUNC_END.
>
> And make sure ENTRY/ENDPROC is not defined on X86_64, given these were
> the last users.
>
> Signed-off-by: Jiri Slaby <jsl...@suse.cz>
> Cc: "H. Peter Anvin" <h...@zytor.com>
> Cc: Thomas Gleixner <t...@linutronix.de>
> Cc: Ingo Molnar <mi...@redhat.com>
> Cc: x...@kernel.org
> Cc: Herbert Xu <herb...@gondor.apana.org.au>
> Cc: "David S. Miller" <da...@davemloft.net>
> Cc: "Rafael J. Wysocki" <r...@rjwysocki.net>
> Cc: Len Brown <len.br...@intel.com>
> Cc: Pavel Machek <pa...@ucw.cz>
> Cc: Matt Fleming <m...@codeblueprint.co.uk>
> Cc: Ard Biesheuvel <ard.biesheu...@linaro.org>
> Cc: Boris Ostrovsky <boris.ostrov...@oracle.com>
> Cc: Juergen Gross <jgr...@suse.com>
> Cc: linux-cry...@vger.kernel.org
> Cc: linux...@vger.kernel.org
> Cc: linux-...@vger.kernel.org
> Cc: xen-de...@lists.xenproject.org
> ---
>  arch/x86/boot/compressed/efi_thunk_64.S           |  4 +-
>  arch/x86/boot/compressed/head_64.S                | 16 ++++----
>  arch/x86/crypto/aes-i586-asm_32.S                 |  8 ++--
>  arch/x86/crypto/aes-x86_64-asm_64.S               |  4 +-
>  arch/x86/crypto/aes_ctrby8_avx-x86_64.S           | 12 +++---
>  arch/x86/crypto/aesni-intel_asm.S                 | 44 +++++++++++-----------
>  arch/x86/crypto/aesni-intel_avx-x86_64.S          | 24 ++++++------
>  arch/x86/crypto/blowfish-x86_64-asm_64.S          | 16 ++++----
>  arch/x86/crypto/camellia-aesni-avx-asm_64.S       | 24 ++++++------
>  arch/x86/crypto/camellia-aesni-avx2-asm_64.S      | 24 ++++++------
>  arch/x86/crypto/camellia-x86_64-asm_64.S          | 16 ++++----
>  arch/x86/crypto/cast5-avx-x86_64-asm_64.S         | 16 ++++----
>  arch/x86/crypto/cast6-avx-x86_64-asm_64.S         | 24 ++++++------
>  arch/x86/crypto/chacha20-avx2-x86_64.S            |  4 +-
>  arch/x86/crypto/chacha20-ssse3-x86_64.S           |  8 ++--
>  arch/x86/crypto/crc32-pclmul_asm.S                |  4 +-
>  arch/x86/crypto/crc32c-pcl-intel-asm_64.S         |  4 +-
>  arch/x86/crypto/crct10dif-pcl-asm_64.S            |  4 +-
>  arch/x86/crypto/des3_ede-asm_64.S                 |  8 ++--
>  arch/x86/crypto/ghash-clmulni-intel_asm.S         |  8 ++--
>  arch/x86/crypto/poly1305-avx2-x86_64.S            |  4 +-
>  arch/x86/crypto/poly1305-sse2-x86_64.S            |  8 ++--
>  arch/x86/crypto/salsa20-x86_64-asm_64.S           | 12 +++---
>  arch/x86/crypto/serpent-avx-x86_64-asm_64.S       | 24 ++++++------
>  arch/x86/crypto/serpent-avx2-asm_64.S             | 24 ++++++------
>  arch/x86/crypto/serpent-sse2-x86_64-asm_64.S      |  8 ++--
>  arch/x86/crypto/sha1-mb/sha1_mb_mgr_flush_avx2.S  |  8 ++--
>  arch/x86/crypto/sha1-mb/sha1_mb_mgr_submit_avx2.S |  4 +-
>  arch/x86/crypto/sha1-mb/sha1_x8_avx2.S            |  4 +-
>  arch/x86/crypto/sha1_avx2_x86_64_asm.S            |  4 +-
>  arch/x86/crypto/sha1_ni_asm.S                     |  4 +-
>  arch/x86/crypto/sha1_ssse3_asm.S                  |  4 +-
>  arch/x86/crypto/sha256-avx-asm.S                  |  4 +-
>  arch/x86/crypto/sha256-avx2-asm.S                 |  4 +-
>  .../crypto/sha256-mb/sha256_mb_mgr_flush_avx2.S   |  8 ++--
>  .../crypto/sha256-mb/sha256_mb_mgr_submit_avx2.S  |  4 +-
>  arch/x86/crypto/sha256-mb/sha256_x8_avx2.S        |  4 +-
>  arch/x86/crypto/sha256-ssse3-asm.S                |  4 +-
>  arch/x86/crypto/sha256_ni_asm.S                   |  4 +-
>  arch/x86/crypto/sha512-avx-asm.S                  |  4 +-
>  arch/x86/crypto/sha512-avx2-asm.S                 |  4 +-
>  .../crypto/sha512-mb/sha512_mb_mgr_flush_avx2.S   |  8 ++--
>  .../crypto/sha512-mb/sha512_mb_mgr_submit_avx2.S  |  4 +-
>  arch/x86/crypto/sha512-mb/sha512_x4_avx2.S        |  4 +-
>  arch/x86/crypto/sha512-ssse3-asm.S                |  4 +-
>  arch/x86/crypto/twofish-avx-x86_64-asm_64.S       | 24 ++++++------
>  arch/x86/crypto/twofish-x86_64-asm_64-3way.S      |  8 ++--
>  arch/x86/crypto/twofish-x86_64-asm_64.S           |  8 ++--
>  arch/x86/entry/entry_64.S                         | 10 ++---
>  arch/x86/entry/entry_64_compat.S                  |  8 ++--
>  arch/x86/kernel/acpi/wakeup_64.S                  |  8 ++--
>  arch/x86/kernel/head_64.S                         | 12 +++---
>  arch/x86/lib/checksum_32.S                        |  8 ++--
>  arch/x86/lib/clear_page_64.S                      | 12 +++---
>  arch/x86/lib/cmpxchg16b_emu.S                     |  4 +-
>  arch/x86/lib/cmpxchg8b_emu.S                      |  4 +-
>  arch/x86/lib/copy_page_64.S                       |  4 +-
>  arch/x86/lib/copy_user_64.S                       | 16 ++++----
>  arch/x86/lib/csum-copy_64.S                       |  4 +-
>  arch/x86/lib/getuser.S                            | 16 ++++----
>  arch/x86/lib/hweight.S                            |  8 ++--
>  arch/x86/lib/iomap_copy_64.S                      |  4 +-
>  arch/x86/lib/memcpy_64.S                          |  4 +-
>  arch/x86/lib/memmove_64.S                         |  4 +-
>  arch/x86/lib/memset_64.S                          |  4 +-
>  arch/x86/lib/msr-reg.S                            |  8 ++--
>  arch/x86/lib/putuser.S                            | 16 ++++----
>  arch/x86/lib/rwsem.S                              | 20 +++++-----
>  arch/x86/mm/mem_encrypt_boot.S                    |  8 ++--
>  arch/x86/platform/efi/efi_stub_64.S               |  4 +-
>  arch/x86/platform/efi/efi_thunk_64.S              |  4 +-
>  arch/x86/power/hibernate_asm_64.S                 |  8 ++--
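
The conversion the changelog describes is mechanical and has the same shape in every file touched: each globally visible function's opening ENTRY annotation becomes SYM_FUNC_START, and the matching ENDPROC becomes SYM_FUNC_END. An illustrative hunk (not copied from the patch itself; the file and function are chosen only as an example from the diffstat above) would look like:

```
--- a/arch/x86/lib/memcpy_64.S
+++ b/arch/x86/lib/memcpy_64.S
@@
-ENTRY(memcpy)
+SYM_FUNC_START(memcpy)
 	movq %rdi, %rax
 	...
-ENDPROC(memcpy)
+SYM_FUNC_END(memcpy)
```
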
For the hibernate changes: Reviewed-by: Rafael J. Wysocki <rafael.j.wyso...@intel.com>