Hi,

On 04.03.2015 15:18, Xiaodong Gong wrote:
> @@ -157,6 +178,224 @@ static int vpc_probe(const uint8_t *buf, int buf_size, const char *filename)
...
> +static int vpc_decode_maxc_loc(BlockDriverState *bs, uint32_t data_length)
...
> +    cd = g_iconv_open("ASCII", "UTF8");
...
> +static int vpc_decode_w2u_loc(BlockDriverState *bs, uint32_t data_length)
...
> +    cd = g_iconv_open("ASCII", "UTF-16LE");

Please correct me if my understanding is wrong, but a hard-coded "ASCII"
is AFAIK wrong, as that charset covers only the 7-bit characters.

To the Linux kernel a file name is just a string of bytes, but when it
gets displayed to the user, those bytes are converted to characters.
The conversion depends on the locale in use, which nowadays is most
often UTF-8 (LANG=de_DE.UTF-8, or more specifically LC_CTYPE), but some
years back it was typically ISO-8859-1 (or whatever else).

So if I create a backing file whose name contains a non-ASCII umlaut,
the conversion will break: ß (U+00DF) is the byte sequence 0xC3 0x9F in
UTF-8 (which renders as "ß" when misread as ISO-8859-1), but the single
byte 0xDF in ISO-8859-1, and it does not exist in ASCII at all.

AFAIK nl_langinfo(CODESET) returns the codeset belonging to the locale
previously set up via setlocale(LC_ALL, ""), which any main program
needs to call anyway.
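
A sketch of what I have in mind (the helper name is made up), assuming
main() has already called setlocale():

#include <locale.h>
#include <langinfo.h>
#include <glib.h>

/* Open a conversion descriptor targeting the user's locale codeset
 * instead of a hard-coded "ASCII". */
static GIConv open_conv_to_locale(const char *from_codeset)
{
    /* nl_langinfo(CODESET) returns e.g. "UTF-8" under
     * LANG=de_DE.UTF-8, or "ISO-8859-1" under a latin1 locale. */
    const char *codeset = nl_langinfo(CODESET);
    return g_iconv_open(codeset, from_codeset);
}

glib also exposes the locale codeset via g_get_charset(), and
g_locale_from_utf8() converts straight to it, so the UTF-16LE branch
might be the only one needing an explicit g_iconv_open() at all.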

Am I missing something?

Sincerely
Philipp
