adj_len_to_page does not return the correct result when the address is
already page aligned and the length is bigger than a page: it returns 0
instead of TARGET_PAGE_SIZE. Fix that.
Signed-off-by: Aurelien Jarno <aurel...@aurel32.net>
---
 target/s390x/mem_helper.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

Patch 17 (improve MOVE LONG and MOVE LONG EXTENDED) triggered a bug in
adj_len_to_page. This is the fix; I didn't want to respin the whole
series just for that.

diff --git a/target/s390x/mem_helper.c b/target/s390x/mem_helper.c
index 6add413531..7841808fa2 100644
--- a/target/s390x/mem_helper.c
+++ b/target/s390x/mem_helper.c
@@ -61,7 +61,7 @@ static inline uint32_t adj_len_to_page(uint32_t len, uint64_t addr)
 {
 #ifndef CONFIG_USER_ONLY
     if ((addr & ~TARGET_PAGE_MASK) + len - 1 >= TARGET_PAGE_SIZE) {
-        return -addr & ~TARGET_PAGE_MASK;
+        return (~addr & ~TARGET_PAGE_MASK) + 1;
     }
 #endif
     return len;
-- 
2.11.0
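For reference, a minimal standalone sketch of the arithmetic, not part of
the patch: it assumes s390x's 4 KiB target page and redefines
TARGET_PAGE_SIZE/TARGET_PAGE_MASK locally so it compiles on its own. For
a page-aligned address the old expression yields 0 even though a full
page remains, while the fixed expression yields TARGET_PAGE_SIZE.

#include <stdint.h>
#include <stdio.h>

/* local stand-ins for the QEMU macros, 4 KiB pages as on s390x */
#define TARGET_PAGE_SIZE 4096ULL
#define TARGET_PAGE_MASK (~(TARGET_PAGE_SIZE - 1))

int main(void)
{
    uint64_t addr = 0x1000;   /* already page aligned */
    uint32_t len  = 0x2000;   /* bigger than a page */

    /* same guard as adj_len_to_page: the access crosses a page boundary */
    if ((addr & ~TARGET_PAGE_MASK) + len - 1 >= TARGET_PAGE_SIZE) {
        /* old expression: 0 for an aligned address, i.e. no progress */
        printf("old: %u\n", (uint32_t)(-addr & ~TARGET_PAGE_MASK));
        /* fixed expression: 4096, one full page as expected */
        printf("new: %u\n", (uint32_t)((~addr & ~TARGET_PAGE_MASK) + 1));
    }
    return 0;
}

Compiled and run, this prints "old: 0" and "new: 4096"; for any unaligned
address both expressions agree.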