On Dec 14, 2006, at 1:48 PM, Michael Galloway wrote:

On Thu, Dec 14, 2006 at 02:01:54PM -0500, Michael Galloway wrote:
Good day all, I've been trying to build OMPI with the 6.2-X version of the PGI compiler set (pgcc 6.2-4, 64-bit target, on x86-64 Linux). I've tried both 1.1.2 and the current nightly build 1.1.3b2r12766; both fail with this error from configure:

*** Fortran 77 compiler
checking whether we are using the GNU Fortran 77 compiler... no
checking whether /opt/pgi/linux86-64/6.2/bin/pgf77 accepts -g... yes
checking if Fortran 77 compiler works... yes
checking /opt/pgi/linux86-64/6.2/bin/pgf77 external symbol convention... single underscore
checking if Fortran 77 compiler supports LOGICAL... yes
checking size of Fortran 77 LOGICAL... 4
checking for C type corresponding to LOGICAL... not found
configure: WARNING: *** Did not find corresponding C type
configure: error: Cannot continue

It appears that this is being caused by a problem much earlier in the configure process. For some reason I don't totally understand, the configure script is finding the size of shorts, ints, longs, long longs, void *s, etc. to be 2. This is rather unexpected on a 64-bit machine, obviously. In testing with PGI 6.1-3 on x86_64, it looks like the IPA option is causing this to occur -- if I remove the IPA option from the flags, configure runs properly.

In my (somewhat humble) opinion, turning on such aggressive options is dangerous when compiling Open MPI (or most MPI implementations). With shared memory and RDMA interconnects, the assumptions compiler writers make at such high optimization settings are frequently not true for codes like Open MPI. And the performance gain from compiling the MPI layer with such high optimizations is frequently near zero (that said, there is a significant advantage to adding some optimization flags and omitting debugging symbols from the MPI library if you don't need them).

It should be safe to compile Open MPI without the IPA options and still use IPA when compiling your application.
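Concretely, that split might look like the following sketch. The compiler variable names are standard Autoconf ones; the IPA spelling (`-Mipa=fast`) is the common PGI form, but check your PGI release's documentation for the exact option set you were using:

```shell
# Build Open MPI itself with moderate optimization and no IPA:
./configure CC=pgcc CXX=pgCC F77=pgf77 FC=pgf90 \
    CFLAGS="-O2" CXXFLAGS="-O2" FFLAGS="-O2" FCFLAGS="-O2"
make all install

# Then apply IPA only when compiling your own application,
# through the Open MPI wrapper compilers:
mpicc -Mipa=fast -o my_app my_app.c
```

The wrapper compilers pass your application flags straight through, so IPA applies to your code without affecting how the MPI library itself was built.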


Brian

--
  Brian Barrett
  Open MPI Team, CCS-1
  Los Alamos National Laboratory

