Hi Janne,

This is an ABI change, so it is serious… it will require people to recompile 
older code and libraries with the new compiler. Do we already plan to break the 
ABI in this cycle, or is this the first ABI-breaking patch of the cycle? And do 
we have real-life examples of character strings larger than 2 GB?

> Also, as there are some places in the frontend where negative character
> lengths are used as special flag values, in the frontend the character
> length is handled as a signed variable of the same size as a size_t,
> although in the runtime library it really is size_t.

First, I thought: we should really make it size_t, and have the negative values 
be well-defined constants, e.g. (size_t) -1

On the other hand, there is the problem of the front-end having a different 
size_t than the target: think compiling 32-bit code on 64-bit x86 (front-end 
size_t larger than target size_t), or cross-compiling for a 64-bit target on a 
32-bit machine (front-end size_t smaller than target size_t). So the charlen 
type bounds need to be determined when the front-end runs, not when it is 
compiled (i.e. it is not a fixed type).

In iresolve.c, the "Why is this fixup needed?" comment is kinda scary.


> I haven't changed the character length variables for the co-array
> intrinsics, as this is something that may need to be synchronized with
> OpenCoarrays.

Won’t that mean that coarray programs will fail due to ABI mismatch?


FX
