When resolving a string literal we check each character against the low
and high bounds of the expected component type. We stored each character
as an Int and implicitly converted it to Uint twice: once inside the body
of the "<" operator and once inside the body of the ">" operator. Now we
store it as a Uint, converting explicitly and only once via UI_From_CC.
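
To illustrate the cost being removed, here is a minimal, self-contained
Ada sketch. The Int, Uint, and UI_From_Int declarations below are
simplified stand-ins, not GNAT's actual Uintp declarations: each mixed
Int/Uint comparison converts its Int operand inside the operator body,
so comparing against both bounds paid for two conversions.

procedure Mixed_Compare_Sketch is

   type Int is range -2**31 .. 2**31 - 1;

   --  Stand-in for a big-number type; the real Uint is a private
   --  arbitrary-precision type from Uintp.
   type Uint is new Long_Long_Integer;

   function UI_From_Int (V : Int) return Uint is (Uint (V));

   --  Mixed operators: each call converts its Int operand.
   function "<" (Left : Int; Right : Uint) return Boolean is
     (UI_From_Int (Left) < Right);
   function ">" (Left : Int; Right : Uint) return Boolean is
     (UI_From_Int (Left) > Right);

   Lo : constant Uint := 0;
   Hi : constant Uint := 255;

   Char_Val_Int : constant Int  := 65;
   Char_Val     : constant Uint := UI_From_Int (Char_Val_Int);

begin
   --  Before: two mixed comparisons, hence two conversions inside
   --  the operator bodies.
   if Char_Val_Int < Lo or else Char_Val_Int > Hi then
      null;
   end if;

   --  After: one explicit conversion, then Uint-to-Uint comparisons.
   if Char_Val < Lo or else Char_Val > Hi then
      null;
   end if;
end Mixed_Compare_Sketch;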

Cleanup related to the handling of compile-time constraint errors.
Semantics is unaffected.

Tested on x86_64-pc-linux-gnu, committed on trunk

gcc/ada/

        * sem_res.adb (Resolve_String_Literal): Avoid unnecessary
        conversions inside "<" and ">" bodies.
diff --git a/gcc/ada/sem_res.adb b/gcc/ada/sem_res.adb
--- a/gcc/ada/sem_res.adb
+++ b/gcc/ada/sem_res.adb
@@ -11722,14 +11722,14 @@ package body Sem_Res is
                Comp_Typ_Hi : constant Node_Id :=
                                Type_High_Bound (Component_Type (Typ));
 
-               Char_Val : Int;
+               Char_Val : Uint;
 
             begin
                if Compile_Time_Known_Value (Comp_Typ_Lo)
                  and then Compile_Time_Known_Value (Comp_Typ_Hi)
                then
                   for J in 1 .. Strlen loop
-                     Char_Val := Int (Get_String_Char (Str, J));
+                     Char_Val := UI_From_CC (Get_String_Char (Str, J));
 
                      if Char_Val < Expr_Value (Comp_Typ_Lo)
                        or else Char_Val > Expr_Value (Comp_Typ_Hi)

