The setup code uses octal escapes with printf to generate the BOMs for
UTF-16/32 BE/LE. It specifically uses '\777' to emit a 0xff byte, relying
on the fact that most shells truncate octal values above \377.

Ash, however, interprets '\777' as '\77' followed by a literal '7',
resulting in an invalid BOM.

Fix this by using the correct octal escape for 0xff: '\377'.

Signed-off-by: Kevin Daudt <m...@ikke.info>
---
I do wonder why this code is using octal values in the first place,
rather than using hex values.
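
For illustration (not part of the patch): a quick way to compare the two
escapes is to pipe printf output through od. The '\377' form is portable,
while the behavior of '\777' differs between shells:

```shell
# '\377' is octal 377 = 0xff, the largest value a single byte can hold,
# so every POSIX printf handles it the same way.
printf '\377\376' | od -An -tx1    # the UTF-16 LE BOM bytes: ff fe

# '\777' exceeds the one-byte range; shells disagree on how to parse it
# (truncation vs. '\77' + literal '7'), so its output varies -- compare:
printf '\777' | od -An -tx1
```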

 t/t0028-working-tree-encoding.sh | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/t/t0028-working-tree-encoding.sh b/t/t0028-working-tree-encoding.sh
index 8936ba6757..c6b68c22ca 100755
--- a/t/t0028-working-tree-encoding.sh
+++ b/t/t0028-working-tree-encoding.sh
@@ -49,12 +49,12 @@ test_expect_success 'setup test files' '
        # BOM tests
        printf "\0a\0b\0c"                         >nobom.utf16be.raw &&
        printf "a\0b\0c\0"                         >nobom.utf16le.raw &&
-       printf "\376\777\0a\0b\0c"                 >bebom.utf16be.raw &&
-       printf "\777\376a\0b\0c\0"                 >lebom.utf16le.raw &&
+       printf "\376\377\0a\0b\0c"                 >bebom.utf16be.raw &&
+       printf "\377\376a\0b\0c\0"                 >lebom.utf16le.raw &&
        printf "\0\0\0a\0\0\0b\0\0\0c"             >nobom.utf32be.raw &&
        printf "a\0\0\0b\0\0\0c\0\0\0"             >nobom.utf32le.raw &&
-       printf "\0\0\376\777\0\0\0a\0\0\0b\0\0\0c" >bebom.utf32be.raw &&
-       printf "\777\376\0\0a\0\0\0b\0\0\0c\0\0\0" >lebom.utf32le.raw &&
+       printf "\0\0\376\377\0\0\0a\0\0\0b\0\0\0c" >bebom.utf32be.raw &&
+       printf "\377\376\0\0a\0\0\0b\0\0\0c\0\0\0" >lebom.utf32le.raw &&
 
        # Add only UTF-16 file, we will add the UTF-32 file later
        cp test.utf16.raw test.utf16 &&
-- 
2.19.1
