Hello Everyone,
This is my first post to the Open MPI list.
I am trying to run a simple program that does an MPI_Sendrecv between two
nodes, each of which has two interface cards.
Both nodes are running RHEL 6, with Open MPI 1.4.4 on an 8-core Xeon
processor.
What I noticed was that when using two [...]
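In case it is useful context, this is roughly how I point Open MPI at the two
interfaces (a sketch only; the interface names eth0/eth1, the host names, and
the program name are assumptions about my setup):

    # ask Open MPI's TCP BTL to use both NICs on each node (names are examples)
    mpirun -np 2 --host node1,node2 \
        --mca btl_tcp_if_include eth0,eth1 ./sendrecv_test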
contrib/platform/lanl/cray-xe6 is within the Open MPI trunk. It contains
platform files for an optimized and a debug build (for configure --platform).
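For example, a build using one of those files might look like this (the exact
file name under that directory is an assumption; check what is actually in the
checkout):

    # point configure at a platform file from the trunk checkout
    ./configure --platform=contrib/platform/lanl/cray-xe6/optimized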
-Nathan
On Thu, 16 Feb 2012, Abhinav Sarje wrote:
Hi again,
Could you tell me the complete path of the svn repo? I think I am
missing something there. [...]
Hi yet again,
Nathan, never mind my previous email. I found the svn location. I will
give it a try now.
Thanks!
Abhinav.
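For the record, an anonymous read-only checkout should work along these lines
(URL from memory, so treat it as an assumption and verify on the website):

    # check out the Open MPI development trunk anonymously
    svn checkout http://svn.open-mpi.org/svn/ompi/trunk ompi-trunk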
On Thu, Feb 16, 2012 at 12:00 PM, Abhinav Sarje wrote:
> Hi again,
>
> Could you tell me the complete path of the svn repo? I think I am
> missing something there. Also, can I check out this version
> anonymously, or do I need a dev account?
Hi again,
Could you tell me the complete path of the svn repo? I think I am
missing something there. Also, can I check out this version
anonymously, or do I need a dev account?
Thanks,
Abhinav.
On Thu, Feb 16, 2012 at 11:52 AM, Abhinav Sarje wrote:
> Hi Nathan,
>
> I had earlier tried to compile Open MPI with just the PGI compilers
> (without the Cray wrappers) [...]
Hi Nathan,
I had earlier tried to compile Open MPI with just the PGI compilers
(without the Cray wrappers), but my code was then not able to run on
the compute nodes of the Cray cluster (it only ran on the MOM node).
Therefore I have been trying to compile Open MPI with the Cray
wrappers.
I will check [...]
Abhinav, you shouldn't be using the Cray wrappers to build Open MPI or anything
linked against Open MPI. The Cray wrappers will automatically pull in a lot of
stuff you don't want. Use the underlying compilers (e.g., pgcc or icc) directly.
You shouldn't have any trouble running in parallel with either aprun or mpirun
(or [...]
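A sketch of what such a build might look like (the compiler names, flags, and
install prefix below are assumptions, not a verified recipe):

    # configure Open MPI with the PGI compilers directly, no Cray wrappers
    ./configure CC=pgcc CXX=pgCC F77=pgf77 FC=pgf90 --prefix=$HOME/ompi
    make -j8 all install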
On Feb 16, 2012, at 9:09 AM, ya...@adina.com wrote:
> (2) Solution to this issue:
>
> You may set $TMPDIR to the same directory for all processes on the
> same host if possible; or you could setenv OMPI_PREFIX_ENV to a
> common directory for the MPI processes on each host while keeping
> your $TMPDIR setting.
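A minimal sketch of the second option (csh syntax to match the setenv
suggestion above; the directory path itself is an assumption):

    # point Open MPI's session directory at a common per-host location
    setenv OMPI_PREFIX_ENV /tmp/ompi-sessions
    # bash/sh equivalent:
    export OMPI_PREFIX_ENV=/tmp/ompi-sessions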
On 02/15/2012 07:44 AM, Reuti wrote:
> Hi,
>
> Am 15.02.2012 um 03:48 schrieb alexalex43210:
>
>> But I am a novice at parallel computation. I usually use Fortran to
>> write my programs, and now I want to parallelize them. Can you give me
>> some help on how to begin?
>> PS: I learned about OP [...]
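A minimal way to begin (a sketch assuming Open MPI's wrapper compilers are
installed; hello.f90 is a made-up example):

    # write a tiny Fortran MPI program, then compile and run it
    cat > hello.f90 <<'EOF'
    program hello
      use mpi
      implicit none
      integer :: ierr, rank, nprocs
      call MPI_Init(ierr)
      call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)
      print *, 'Hello from rank', rank, 'of', nprocs
      call MPI_Finalize(ierr)
    end program hello
    EOF
    mpif90 hello.f90 -o hello   # Open MPI's Fortran wrapper compiler
    mpirun -np 4 ./hello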
OK, with Jeff's kind help, I solved this issue in a very simple way.
Now I would like to report back the reason for this issue and the
solution.
(1) The scenario under which this issue happened:
In my Open MPI environment, the $TMPDIR environment variable is set to a
different scratch directory for different MPI processes [...]
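(For anyone who wants to confirm the same symptom, a quick check along these
lines should work; the host names are placeholders:)

    # print each process's host and $TMPDIR to see whether they differ
    mpirun -np 2 --host node1,node2 sh -c 'echo "$(hostname): $TMPDIR"'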
Brian McNally writes:
> Hi Dave,
>
> I looked through the INSTALL, VERSION, NEWS, and README files in the
> 1.5.4 openmpi tarball but didn't see what you were referring to.
I can't access the web site, but there's an item in the notes on the
download page about the bug. It must also be in the [...]