------- Comment #1 from paolo dot carlini at oracle dot com 2008-10-30 16:07 -------
(In reply to comment #0)
> It seems that the implementation always sets str.eofbit when, after the call
> to the function, (in == end).
>
> But the standard states (22.2.2.1.2, p16) that this flag should be set only
> when:
> "if, when seeking another character to match, it is found that (in == end)"
> (on success)
> or "if the reason for the failure was that (in == end)" (on failure).
>
> These conditions are not the same as simply (in == end).
Frankly, I don't get it. In my reading of the standard, whether val ends up set or not, if in == end at the end of parsing then we have eofbit. Maybe you should simply attach a testcase where the behavior is incorrect; the provided one is fine (and consistent with the general behavior for numeric parsing).

--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=37958