Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-12 Thread Jeff Squyres (jsquyres)
The MPICH folks, who lurk on this list (gasp!), actually replied to me off-list to say that they had pretty much the same problem, and pointed out a clever solution. Thanks, MPICH guys! I committed a fix to the trunk and to v1.6 (and will CMR it to v1.7, too). This is, admittedly, a f

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-12 Thread Jeff Squyres (jsquyres)
You should definitely be able to disable Fortran on the trunk. The option changed name, though -- it's now --disable-mpi-fortran (vs. --disable-mpi-f77) -- because we unified f77 and f90 support (e.g., mpifort, not mpif77 or mpif90). See this blog entry (which was written a while ago; it's now all on the t
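The flag rename described above can be sketched as follows. This is a minimal illustration of the configure lines only; it assumes you are in an expanded Open MPI source tree, and the paths are hypothetical:

```shell
# Open MPI 1.6.x: Fortran 77 and Fortran 90 bindings are disabled
# with separate flags.
./configure --disable-mpi-f77 --disable-mpi-f90

# Trunk / later series: one unified flag covers all Fortran bindings,
# matching the unified "mpifort" wrapper compiler.
./configure --disable-mpi-fortran
```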

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-12 Thread Mark Bolstad
Yup, gfortran was the problem. It works now. It also explains why the trunk version worked. In the trunk you can't disable fortran at all, so I had to uninstall gfortran. Thanks for all the help. Mark On Tue, Feb 12, 2013 at 8:21 AM, Jeff Squyres (jsquyres) wrote: > I looked closer at your co

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-12 Thread Jeff Squyres (jsquyres)
I looked closer at your configure output this morning, and I think I see the issue: I think your gfortran may be borked -- here's some output in config.log: - configure:163678: checking if gfortran supports -c -o file.o configure:163699: gfortran -c -o out/conftest2.o conftest.f >&5 i686-app
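The failing configure test above can be reproduced by hand. This is a standalone sketch of the same "-c -o" conftest idea; the file names here are illustrative, not the ones configure actually generates, and gfortran is assumed to be on PATH:

```shell
# Write a minimal Fortran source file, then check whether gfortran can
# compile and name the object file in a single step (-c -o), which is
# exactly what configure probes for.
cat > conftest.f <<'EOF'
      program main
      end
EOF

if gfortran -c -o conftest.o conftest.f 2>conftest.err; then
    echo "gfortran looks healthy"
else
    echo "gfortran appears broken or missing:"
    cat conftest.err
fi

# Clean up the scratch files.
rm -f conftest.f conftest.o conftest.err
```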

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-12 Thread Mark Bolstad
On Mon, Feb 11, 2013 at 10:44 PM, Jeff Squyres (jsquyres) < jsquy...@cisco.com> wrote: > I got your tarball (no need to re-send it). > > I'm a little confused by your output from make, though. > > Did you run autogen? If so, there's no need to do that -- try expanding a > fresh tarball and just r

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Jeff Squyres (jsquyres)
On Feb 11, 2013, at 3:41 PM, "Beatty, Daniel D CIV NAVAIR, 474300D" wrote: > The Intel+PPC is one issue. However, even on Intel, there tends to be a > distinction between Intel environments going from Xeon to Core iX > environments. While Objective-C/C/C++ handle this well, the Fortran > compil

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Jeff Squyres (jsquyres)
I got your tarball (no need to re-send it). I'm a little confused by your output from make, though. Did you run autogen? If so, there's no need to do that -- try expanding a fresh tarball and just running ./configure and make. On Feb 11, 2013, at 10:03 PM, Mark Bolstad wrote: > I packed t

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Mark Bolstad
I packed the compile info as requested but the message is too big. Changing the compression didn't help. I can split it, or do you just want to approve it out of the hold queue? Mark On Mon, Feb 11, 2013 at 3:03 PM, Jeff Squyres (jsquyres) wrote: > On Feb 11, 2013, at 2:46 PM, Mark Bolstad > wr

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Beatty, Daniel D CIV NAVAIR, 474300D
Hi Jeff, The Intel+PPC is one issue. However, even on Intel, there tends to be a distinction between Intel environments going from Xeon to Core iX environments. While Objective-C/C/C++ handle this well, the Fortran compilers have given me a different story over the years. It tends to be the case

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Jeff Squyres (jsquyres)
On Feb 11, 2013, at 2:46 PM, Mark Bolstad wrote: > That's what I noticed, no .so's (actually, I noticed that the dlname in the > .la file is empty. thank you, dtruss) Please send all the information listed here: http://www.open-mpi.org/community/help/ > I've built it two different ways: >

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Mark Bolstad
That's what I noticed, no .so's (actually, I noticed that the dlname in the .la file is empty; thank you, dtruss). I've built it two different ways: --disable-mpi-f77 and --prefix=/Users/bolstadm/papillon/build/macosx-x86_64/Release/openmpi-1.6.3 --disable-mpi-f77 --with-openib=no --enable-shared
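The empty-dlname symptom mentioned above can be checked without dtruss: libtool records the shared-library name in each component's .la file, and an empty dlname means only a static archive (.a) was built, so dlopen of that plugin has nothing to load. A small sketch, using a fabricated sample .la file for illustration:

```shell
# Fabricated example of a static-only plugin descriptor; a healthy
# build would have something like dlname='mca_btl_tcp.so' here.
cat > mca_btl_tcp.la <<'EOF'
# mca_btl_tcp.la - a libtool library file
dlname=''
old_library='mca_btl_tcp.a'
EOF

# Extract the dlname value; empty means no shared object was built.
dlname=$(sed -n "s/^dlname='\(.*\)'/\1/p" mca_btl_tcp.la)
if [ -z "$dlname" ]; then
    echo "static-only plugin (no .so built)"
else
    echo "shared plugin: $dlname"
fi

rm -f mca_btl_tcp.la
```

On a real install you would run the same check over $prefix/lib/openmpi/*.la.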

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Jeff Squyres (jsquyres)
On Feb 11, 2013, at 1:11 PM, "Beatty, Daniel D CIV NAVAIR, 474300D" wrote: > There are two issues that have concerned me. One is universal capabilities, > namely ensuring that the library allows the same results for binaries in both > any of their universal compiled forms. Not sure what y

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Jeff Squyres (jsquyres)
Ah -- your plugins are all .a files. How did you configure/build Open MPI? On Feb 11, 2013, at 11:09 AM, Mark Bolstad wrote: > It's not just one plugin, it was about 6 of them. I just deleted the error > message from the others as I believed that opal_init was the problem. > > However, I hav
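One way the all-.a situation arises is a build configured for static libraries only. A hedged sketch of a configure line that requests shared libraries and plugins instead; the prefix and job count are illustrative examples, not values from this thread:

```shell
# Build libmpi and the MCA components as shared objects so they can be
# dlopen'ed at runtime; --disable-static avoids producing .a-only
# components like the ones observed above.
./configure --prefix=$HOME/openmpi --enable-shared --disable-static
make -j4 all install
```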

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Beatty, Daniel D CIV NAVAIR, 474300D
Greetings Fellow MPI users, I may need to get involved here on this issue also. I will need to do a similar build for Mountain Lion and regular Lion. I am still a little bit in the design phase at this time, so I am paying close attention to this thread. There are two issues that have concerned me.

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Mark Bolstad
It's not just one plugin, it was about 6 of them. I just deleted the error message from the others as I believed that opal_init was the problem. However, I have done a full build multiple times and have blown away all the plugins and other remnants of the build and install and get the same results

Re: [OMPI users] Building 1.6.3 on OS X 10.8

2013-02-11 Thread Jeff Squyres (jsquyres)
That's very odd; I can't think of why that would happen offhand. I build and run all the time on ML with no problems. Can you delete that plugin and run OK? Sent from my phone. No type good. On Feb 10, 2013, at 10:22 PM, "Mark Bolstad" wrote: > I'm having some difficulties with building/running