On Mon, Dec 16, 2024 at 04:53:42AM -0800, Damian Rouson wrote:
> including automatic GPU offloading. Then a few months ago, the death blow
> that I couldn't work around was robust support for kind type parameters.
gfortran doesn't have robust kind type parameters?

% cat xx.f90
program foo
   use iso_fortran_env
   print '(A,*(I0,1X))', '     real kinds: ', real_kinds
   print '(A,*(I0,1X))', '  integer kinds: ', integer_kinds
   print '(A,*(I0,1X))', '  logical kinds: ', logical_kinds
   print '(A,*(I0,1X))', 'character kinds: ', character_kinds
end program foo

% gfcx -o z xx.f90 && ./z
     real kinds: 4 8 10 16
  integer kinds: 1 2 4 8 16
  logical kinds: 1 2 4 8 16
character kinds: 1 4

% ../tmp/lfortran/work/bin/lfortran -o z xx.f90
Segmentation fault (core dumped)

% root/build/bin/flang -o z xx.f90 && ./z
     real kinds: 2 3 4 8 10
  integer kinds: 1 2 4 8 16
  logical kinds: 1 2 4 8
character kinds: 1 2 4

So, flang supports IEEE binary16 and brain float16, which makes sense, as
Nvidia has a vested interest in these types.  flang does not support IEEE
binary128, while gfortran does.  flang also has an additional character
kind type.  Are you doing work with CHARACTER(KIND=2) while not needing a
16-byte logical kind type?  Or, perhaps, you care to qualify your
generalization?

-- 
steve
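(Since the kind values above are processor-dependent, a portable program would
normally request kinds by precision via SELECTED_REAL_KIND rather than
hard-coding the numbers the compilers print. A minimal sketch; the program and
variable names are my own invention:)

```fortran
program kind_query
   ! Request real kinds by decimal precision instead of assuming that
   ! kind 16 (gfortran) or any particular number means binary128.
   implicit none
   ! IEEE binary64 carries ~15 decimal digits; binary128 carries ~33.
   integer, parameter :: dp = selected_real_kind(p=15)
   integer, parameter :: qp = selected_real_kind(p=33)

   print '(A,I0)', 'double-precision kind: ', dp
   ! SELECTED_REAL_KIND returns a negative value when no kind on this
   ! processor satisfies the request, e.g. flang lacking binary128.
   if (qp < 0) then
      print '(A)', 'no binary128-capable real kind on this processor'
   else
      print '(A,I0)', 'quad-precision kind: ', qp
   end if
end program kind_query
```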