Daiki Ueno <u...@gnu.org> writes:

> After the following change (in gnulib), locale_charset() returns "ASCII"
> on Mac OS X, even if the thread's locale is set to "de_DE.UTF-8", say.
> So the solution would be either to make sure MB_CUR_MAX > 1 somehow in
> the test, or to make locale_charset() thread-safe.
Just tried the latter.  I've tested the attached patch on Mac OS X, and
locale_charset() still returns "ASCII" until the first uselocale() or
setlocale() call.

---
 ChangeLog          | 6 ++++++
 lib/localcharset.c | 7 ++++++-
 2 files changed, 12 insertions(+), 1 deletion(-)

diff --git a/ChangeLog b/ChangeLog
index 413e435..87e2339 100644
--- a/ChangeLog
+++ b/ChangeLog
@@ -1,3 +1,9 @@
+2012-12-20  Daiki Ueno  <u...@gnu.org>
+
+	localecharset: make locale_charset thread-safe on Mac OS X
+	* lib/localcharset.c (locale_charset) [DARWIN7]: Use MB_CUR_MAX_L
+	instead of MB_CUR_MAX.
+
 2012-12-08  Stefano Lattarini  <stefano.lattar...@gmail.com>
 
 	maint.mk: avoid extra forks

diff --git a/lib/localcharset.c b/lib/localcharset.c
index 1a94042..1ad03d7 100644
--- a/lib/localcharset.c
+++ b/lib/localcharset.c
@@ -65,6 +65,11 @@
 # include <os2.h>
 #endif
 
+/* For MB_CUR_MAX_L */
+#if defined DARWIN7
+# include <xlocale.h>
+#endif
+
 #if ENABLE_RELOCATABLE
 # include "relocatable.h"
 #else
@@ -545,7 +550,7 @@ locale_charset (void)
 
 #ifdef DARWIN7
   /* Mac OS X sets MB_CUR_MAX to 1 when LC_ALL=C, and "UTF-8"
      (the default codeset) does not work when MB_CUR_MAX is 1.  */
-  if (strcmp (codeset, "UTF-8") == 0 && MB_CUR_MAX <= 1)
+  if (strcmp (codeset, "UTF-8") == 0 && MB_CUR_MAX_L (uselocale (NULL)) <= 1)
     codeset = "ASCII";
 #endif
-- 
1.7.11.7

Regards,
-- 
Daiki Ueno