From: Matt Sealey <matt.sea...@arm.com>

Enable faster 8-byte copies on arm64.

arm64, like x86-64, handles unaligned accesses efficiently, so define
COPY8() as a single 8-byte unaligned load/store pair there as well,
instead of taking the generic fallback below the #else.
Link: http://lkml.kernel.org/r/20181127161913.23863-6-dave.rodg...@arm.com
Signed-off-by: Matt Sealey <matt.sea...@arm.com>
Signed-off-by: Dave Rodgman <dave.rodg...@arm.com>
Cc: David S. Miller <da...@davemloft.net>
Cc: Greg Kroah-Hartman <gre...@linuxfoundation.org>
Cc: Herbert Xu <herb...@gondor.apana.org.au>
Cc: Markus F.X.J. Oberhumer <mar...@oberhumer.com>
Cc: Minchan Kim <minc...@kernel.org>
Cc: Nitin Gupta <nitingupta...@gmail.com>
Cc: Richard Purdie <rpur...@openedhand.com>
Cc: Sergey Senozhatsky <sergey.senozhatsky.w...@gmail.com>
Cc: Sonny Rao <sonny...@google.com>
Signed-off-by: Andrew Morton <a...@linux-foundation.org>
Signed-off-by: Stephen Rothwell <s...@canb.auug.org.au>
---
 lib/lzo/lzodefs.h | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/lib/lzo/lzodefs.h b/lib/lzo/lzodefs.h
index c8965dc181df..06fa83a38e0a 100644
--- a/lib/lzo/lzodefs.h
+++ b/lib/lzo/lzodefs.h
@@ -15,7 +15,7 @@
 #define COPY4(dst, src) \
 	put_unaligned(get_unaligned((const u32 *)(src)), (u32 *)(dst))
-#if defined(CONFIG_X86_64)
+#if defined(CONFIG_X86_64) || defined(CONFIG_ARM64)
 #define COPY8(dst, src) \
 	put_unaligned(get_unaligned((const u64 *)(src)), (u64 *)(dst))
 #else
-- 
2.17.1
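
[Editor's note: for context, below is a minimal userspace sketch of the
behaviour the changed COPY8() enables. The memcpy()-based helper and the
main() demo are illustrative assumptions, not part of the patch; the
kernel uses get_unaligned()/put_unaligned() from <asm/unaligned.h>, but
a fixed-size 8-byte memcpy() is lowered by compilers to the same single
unaligned load/store on x86-64 and arm64.]

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/*
 * Sketch of COPY8: one 8-byte copy that tolerates unaligned pointers.
 * memcpy() here stands in for the kernel's get_unaligned()/
 * put_unaligned() pair (hypothetical stand-in for illustration).
 */
#define COPY8(dst, src)				\
	do {					\
		uint64_t __tmp;			\
		memcpy(&__tmp, (src), 8);	\
		memcpy((dst), &__tmp, 8);	\
	} while (0)

int main(void)
{
	/* Deliberately unaligned source and destination offsets. */
	char src[18] = "x0123456789abcdef";
	char dst[18] = { 0 };

	COPY8(dst + 1, src + 1);	/* copies bytes 1..8  */
	COPY8(dst + 9, src + 9);	/* copies bytes 9..16 */

	printf("%s\n", dst + 1);	/* prints: 0123456789abcdef */
	return 0;
}

[The CONFIG_X86_64/CONFIG_ARM64 guard matters because on architectures
without cheap unaligned access a single 8-byte load or store can fault
or trap, so the #else branch keeps a conservative fallback.]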