On Mon, Dec 19, 2016 at 08:43:01AM -0800, Bob Deen wrote:
>
> It's one thing to break the ABI between the compiler and the gfortran
> library; those can generally be expected to be in sync.  It's another
> to break the ABI between two *languages*, when there might be no such
> expectation (especially if gcc does NOT break their ABI at the same
> version number transition).  Yes, the pre-ISO_C_BINDING method may be
> old-fashioned, but it is a de-facto standard, and breaking it should
> not be done lightly.
Do you really think that those of us who actively contribute to
gfortran development take breaking the ABI lightly?  We have put off
changes to gfortran's library for several years specifically to avoid
ABI breakage.  It seems that there is never a "good time" to break the
ABI.  However, in this case, support for F2008 9.6.4.8, Defined
Input/Output, necessitates a change in the ABI.  Instead of breaking
the ABI multiple times, it has been decided to try to clean up some
long-standing issues with libgfortran.

> If you do proceed with changing the size, I would request that there
> at least be a facility to reliably tell at compile time (on the C
> side) which definition is being used, so I can adjust our macros
> accordingly.  Our code does depend on the size, and it has to be
> cross-platform (and now, if this change is made, cross-version), so
> with this change I would have to support both int and size_t.

As the breakage is going to occur with gfortran 7.0, you can do

% cat a.F90
#if defined(__GFORTRAN__) && (__GNUC__ > 6)
print *, '7'
#else
print *, 'not 7'
#endif
end
% gfc7 -E a.F90 | cat -s
# 1 "a.F90"
# 1 "<built-in>"
# 1 "<command-line>"
# 1 "a.F90"
print *, '7'
end
% gfortran6 -E a.F90 | cat -s
# 1 "a.F90"
# 1 "<built-in>"
# 1 "<command-line>"
# 1 "a.F90"
print *, 'not 7'
end

> Perhaps it might be best to wait until a time when gcc is also
> breaking their ABI, so that there's no question of code (on either
> side) working across the transition...?

There is never a good time.  If we are to wait for gcc, should we
remove support for Defined Input/Output from the compiler?

-- 
Steve
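The same compile-time check works on the C side, since cc from the
matching GCC release also predefines __GNUC__.  What follows is a
minimal sketch under the assumptions in this thread: that the hidden
CHARACTER length argument goes from int to size_t at the major version
named above, and that the C and Fortran compilers come from the same
GCC release.  The file name fstrlen.h, the typedef fstr_len_t, and the
routine greet_ are made up for illustration.

% cat fstrlen.h
#include <stddef.h>

/* Type of the hidden string-length argument that gfortran appends
   for CHARACTER dummy arguments (assumed here: int through
   gfortran 6, size_t afterwards). */
#if defined(__GNUC__) && (__GNUC__ > 6)
typedef size_t fstr_len_t;
#else
typedef int fstr_len_t;
#endif

/* Hypothetical prototype for "subroutine greet(msg)" taking one
   CHARACTER(*) argument.  gfortran's default mangling appends an
   underscore, and the hidden length is passed by value after the
   ordinary arguments, e.g. greet_("hello", (fstr_len_t)5). */
extern void greet_(char *msg, fstr_len_t msg_len);

With a single typedef like this, the macros only have to be adjusted
in one place rather than at every declaration.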