> I'm not sure I understand the problem. The ale3d program from
> LLNL operates exactly as you describe and it can be built
> with mpich, lam, or openmpi.
Hi Doug,
I'm not sure what reply would be most helpful, so here's an attempt.
It sounds like we're on the same page with regard to the desire [...]
On 22.10.2008 at 00:08, Adams, Brian M wrote:
-Original Message-
From: users-boun...@open-mpi.org
[mailto:users-boun...@open-mpi.org] On Behalf Of Reuti
Sent: Tuesday, October 21, 2008 11:36 AM
To: Open MPI Users
Subject: Re: [OMPI users] OpenMPI runtime-specific
environment variable?
Brian,
I'm not sure I understand the problem. The ale3d program from LLNL
operates exactly as you describe and it can be built with mpich, lam,
or openmpi.
Doug Reeder
On Oct 21, 2008, at 3:08 PM, Adams, Brian M wrote:
[...]
On Oct 21, 2008, at 2:33 PM, Adams, Brian M wrote:
-Original Message-
From: users-boun...@open-mpi.org
[mailto:users-boun...@open-mpi.org] On Behalf Of Ralph Castain
Sent: Tuesday, October 21, 2008 10:53 AM
To: Open MPI Users
Subject: Re: [OMPI users] OpenMPI runtime-specific
environment variable?
> -Original Message-
> From: users-boun...@open-mpi.org
> [mailto:users-boun...@open-mpi.org] On Behalf Of Ralph Castain
> Sent: Tuesday, October 21, 2008 10:53 AM
> To: Open MPI Users
> Subject: Re: [OMPI users] OpenMPI runtime-specific
> environment variable?
>
>
> On Oct 21, 2008, at 10:37 AM, Adams, Brian M wrote: [...]
On Oct 21, 2008, at 11:35 AM, Reuti wrote:
Hi,
On 21.10.2008 at 18:52, Ralph Castain wrote:
On Oct 21, 2008, at 10:37 AM, Adams, Brian M wrote:
[...]
On Oct 21, 2008, at 10:37 AM, Adams, Brian M wrote:
We do have some environmental variables that we guarantee to
be "stable" across releases. You could look for
OMPI_COMM_WORLD_SIZE, or OMPI_UNIVERSE_SIZE (there are a
couple of others as well, but any of these would do).
Q: I just wrote a sim [...]
Thank you Doug, Ralph, and Mattijs for the helpful input. Some replies to
Ralph's message and a question inlined here. -- Brian
> -Original Message-
> From: users-boun...@open-mpi.org
> [mailto:users-boun...@open-mpi.org] On Behalf Of Ralph Castain
> Sent: Monday, October 20, 2008 5:38 PM [...]
I have built the release candidate for ga-4.1 with OpenMPI 1.2.3 and
portland compilers 7.0.2 for Myrinet mx.
Running test.x on 3 Myrinet nodes, each with 4 cores, I get the
following error messages:
warning:regcache incompatible with malloc
libibverbs: Fatal: couldn't read uverbs ABI version
Pedro,
Please send me the first 60 lines of an F06 file (for a serial run).
I need to know the specific version you have.
Can you also please send me the output from "ls -l
/msc/nastran/msc2007/linux864".
Also, you might want to e-mail me directly, since this issue is more
application [...]
I tried that configuration but I got a nastran error. It seems it hasn't got
an analyzer for open proc.
a.solver=/msc/nastran/msc2007/linux8664/analysis.dmp.open
This file does not exist.
How can I get it?
On 10/20/08, Joe Griffin wrote:
>
> Pedro,
>
> If you used "openmpi=yes" then [...]
Most message passing libraries that I've worked with use command line
arguments (that get filtered out in MPI_Init) to pass information to the
started jobs. You could check for 'strange' command line arguments.
Mattijs
On Monday 20 October 2008 23:40, Adams, Brian M wrote:
> I work on [...]