Re: [OMPI users] Problem with PGI 6.1 and OpenMPI-1.1.1

2006-10-20 Thread Jeffrey B. Layton
Jeff Squyres wrote:
> Two questions:
> 1. Have you tried the just-released 1.1.2?
No, not yet.
> 2. Are you closing stdin/out/err?
How do you do this? I did get some help on how to fix the problem by adding '< /dev/null' at the very end of the mpirun line. This seems to have fixed the problem.
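In case it helps anyone searching the archive, here is a minimal sketch of that workaround; the process count and executable name are placeholders, not taken from the original command:

   # Redirect stdin from /dev/null so the MPI processes never wait on the terminal's stdin
   mpirun --hostfile machines.${PBS_JOBID} -np 8 ./bt.A.8 < /dev/null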

Re: [OMPI users] Problem with PGI 6.1 and OpenMPI-1.1.1

2006-10-19 Thread Jeffrey B. Layton
A small update. I was looking through the error file a bit more (it was 159MB). I found the following error message sequence:
[o1:22805] mca_oob_tcp_accept: accept() failed with errno 9.
[o4:11242] [0,1,4]-[0,0,0] mca_oob_tcp_peer_recv_blocking: recv() failed with errno=104
[o1:22805] mca_oob_tc

[OMPI users] Problem with PGI 6.1 and OpenMPI-1.1.1

2006-10-19 Thread Jeffrey B. Layton
Good afternoon, I really hate to post asking for help with a problem, but my own efforts have not worked out well (probably operator error). Anyway, I'm trying to run a code that was built with PGI 6.1 and OpenMPI-1.1.1. The mpirun command looks like: mpirun --hostfile machines.${PBS_JOBID}
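The mpirun command above is cut off by the archive; purely as an illustration of that style of launch under PBS (the copy from $PBS_NODEFILE and the executable name are my assumptions, not part of the original post):

   # Build the per-job hostfile from the nodes PBS assigned, then launch
   cp $PBS_NODEFILE machines.${PBS_JOBID}
   mpirun --hostfile machines.${PBS_JOBID} ./a.out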

Re: [OMPI users] Problem with 1.0.2 and PGI 6.0

2006-04-13 Thread Jeffrey B. Layton
This is all I get. No core dump, no nothing :(
> Do you get any more of an error message than that? Did the process dump core, and if so, what does a backtrace show?
>
> -Original Message-
> From: users-boun...@open-mpi.org [mailto:users-boun...@open-mpi.org] On Behalf Of Jeffrey B
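For anyone following along, a sketch of one standard way to answer that question; the hostfile and binary names here are placeholders, not from the thread:

   # Allow core files to be written, then re-run the failing job
   ulimit -c unlimited
   mpirun --hostfile machines.${PBS_JOBID} ./is.A.8
   # If a core file appears, pull a backtrace from it (type 'bt' at the gdb prompt);
   # the core file may be named core.<pid> on some systems
   gdb ./is.A.8 core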

Re: [OMPI users] Building 1.0.2 with Intel 9.0

2006-04-12 Thread Jeffrey B. Layton
Troy Telford wrote:
> On Wed, 12 Apr 2006 14:34:06 -0600, Jeffrey B. Layton wrote:
>> Good afternoon, While we're on the subject of building OpenMPI-1.0.2 with Intel, I'm having trouble building OpenMPI-1.0.2 with Intel 9.0. I'm starting to wonder if I'm doing som

[OMPI users] Problem with 1.0.2 and PGI 6.0

2006-04-12 Thread Jeffrey B. Layton
Hello, I got OpenMPI 1.0.2 built with PGI 6.0, which fixed my previous problem (the problem with 1.0.1 and multiple TCP networks). However, when I try to run the "is" code from the NPB I get the following error:
[0] func:/home/jlayton/bin/OPENMPI-1.0.2-PGI6.0-OPTERON/lib/libopal.so.0 [0x2a95e2d4a

[OMPI users] Building 1.0.2 with Intel 9.0

2006-04-12 Thread Jeffrey B. Layton
Good afternoon, While we're on the subject of building OpenMPI-1.0.2 with Intel, I'm having trouble building OpenMPI-1.0.2 with Intel 9.0. I'm building OpenMPI-1.0.2 with the following options:
./configure --prefix=/home/jlayton/bin/OPENMPI-1.0.2-INTEL9.0-EM64T --disable-io-romio \
  --e
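The full configure line is truncated above; a hedged sketch of how the Intel 9.0 compilers are commonly handed to configure (the CC/CXX/F77/FC assignments are my assumption; only the prefix and --disable-io-romio flag come from the post):

   # Point configure at the Intel 9.0 compilers; remaining flags mirror the post
   ./configure CC=icc CXX=icpc F77=ifort FC=ifort \
       --prefix=/home/jlayton/bin/OPENMPI-1.0.2-INTEL9.0-EM64T \
       --disable-io-romio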

Re: [OMPI users] Problem running code with OpenMPI-1.0.1

2006-04-12 Thread Jeffrey B. Layton
-Original Message-
From: users-boun...@open-mpi.org [mailto:users-boun...@open-mpi.org] On Behalf Of Jeffrey B. Layton
Sent: Tuesday, April 11, 2006 11:25 AM
To: Open MPI Users
Subject: [OMPI users] Problem running code with OpenMPI-1.0.1

Good morning, I'm trying to run one of the NAS Parallel

Re: [OMPI users] Problem running code with OpenMPI-1.0.1

2006-04-11 Thread Jeffrey B. Layton
. Can you give that a whirl?

-Original Message-
From: users-boun...@open-mpi.org [mailto:users-boun...@open-mpi.org] On Behalf Of Jeffrey B. Layton
Sent: Tuesday, April 11, 2006 11:25 AM
To: Open MPI Users
Subject: [OMPI users] Problem running code with OpenMPI-1.0.1

Good mornin

[OMPI users] Problem running code with OpenMPI-1.0.1

2006-04-11 Thread Jeffrey B. Layton
Good morning, I'm trying to run one of the NAS Parallel Benchmarks (bt) with OpenMPI-1.0.1 that was built with PGI 6.0. The code never starts (at least I don't see any output); it just sits there until I kill it. Then I get the following message:
[0,1,2][btl_tcp_endpoint.c:559:mca_btl_tcp_endpoint_complete_
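A later post in this archive attributes the 1.0.1 problem to the nodes having multiple TCP networks. Purely as a hedged sketch (not the resolution given on the list), the TCP BTL can be pinned to a single interface; the interface name and process count here are assumptions:

   # Restrict the TCP BTL to a single known-good interface (eth0 is an assumption)
   mpirun --mca btl_tcp_if_include eth0 --hostfile machines.${PBS_JOBID} -np 16 ./bt.A.16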

Re: [OMPI users] Problem building OpenMPI 1.0.1 with PGI 6.0

2006-04-05 Thread Jeffrey B. Layton
static version (and thus increasing the compilation speed) you should specify "--enable-static --disable-shared". Thanks, george.

On Apr 5, 2006, at 5:30 PM, Jeffrey B. Layton wrote:
> Good afternoon (evening), I'm trying to build OpenMPI-1.0.1 on a SLES9 syst
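A minimal sketch of George's suggestion applied to the configure line quoted below; the prefix and --disable-romio flag are carried over from the original post:

   # Build only the static libraries and skip the shared ones
   ./configure --prefix=/home/jlayton/bin/OPENMPI-1.0.1-PGI6.0-OPTERON \
       --disable-romio --enable-static --disable-shared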

[OMPI users] Problem building OpenMPI 1.0.1 with PGI 6.0

2006-04-05 Thread Jeffrey B. Layton
Good afternoon (evening), I'm trying to build OpenMPI-1.0.1 on a SLES9 system with PGI 6.0 (gcc and pgcc). I'm disabling romio and enabling static libraries:
./configure --prefix=/home/jlayton/bin/OPENMPI-1.0.1-PGI6.0-OPTERON --disable-romio \
  --enable-static
During the build I get th
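The post does not show how the PGI compilers were passed to configure; a hedged sketch assuming the usual pgcc/pgCC/pgf77/pgf90 drivers (those assignments are my assumption; the prefix and flags come from the post):

   # Hand the PGI 6.0 compilers to configure; other flags mirror the post
   ./configure CC=pgcc CXX=pgCC F77=pgf77 FC=pgf90 \
       --prefix=/home/jlayton/bin/OPENMPI-1.0.1-PGI6.0-OPTERON \
       --disable-romio --enable-static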