I have installed OpenMPI-2.0.0 on 2 systems with IP addresses 172.16.5.33
and 172.16.5.32. I have compiled the hello_c.c and hello_oshmem_c.c files,
which are in the examples directory; the resulting executables are
hello_c and hello_oshmem_c. When I execute hello_c it runs fine, but when
I execute hello_oshmem_c an error occurs.
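For context, hello_c.c boils down to a minimal MPI program along these
lines (a sketch, not the exact example file):

    #include <stdio.h>
    #include "mpi.h"

    int main(int argc, char *argv[])
    {
        int rank, size;

        /* Initialize MPI, then query this process's rank and the world size */
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("Hello, world, I am %d of %d\n", rank, size);
        MPI_Finalize();

        return 0;
    }

A two-host run like the one described would typically be launched with
something like the following (the IPs are the ones from the report):

    mpirun -np 2 --host 172.16.5.33,172.16.5.32 hello_c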
Dear Sir,
I am really thankful for your help and I will definitely try the latest
version to see whether the problem is resolved or not.
Thanks.
On Tue, Aug 2, 2016 at 8:13 PM, Jeff Squyres (jsquyres) wrote:
> Debendra --
>
> Can you try the latest v2.0.1 nightly snapshot tarball and see if the
> problem still occurs?
I tried the latest v2.0.1 nightly snapshot tarball, but the problem still
exists.
Thanks.
On Tue, Aug 2, 2016 at 9:34 PM, Debendra Das wrote:
> Dear Sir,
>
> I am really thankful for your help and I will definitely try the latest
> version to see whether the problem is resolved or not.
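For anyone reproducing this, testing a nightly snapshot generally amounts
to a standard out-of-the-box build; a sketch, with the tarball name and
install prefix as placeholders:

    tar xf openmpi-v2.0.1-nightly.tar.bz2
    cd openmpi-v2.0.1-nightly
    ./configure --prefix=$HOME/ompi-test
    make -j 4 all install
    export PATH=$HOME/ompi-test/bin:$PATH
    mpicc examples/hello_c.c -o hello_c
    oshcc examples/hello_oshmem_c.c -o hello_oshmem_c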
I have installed OpenMPI-2.0.0 on 5 systems with IP addresses 172.16.5.29,
172.16.5.30, 172.16.5.31, 172.16.5.32, and 172.16.5.33. While executing the
hello_oshmem_c.c program (under the examples directory), correct output
comes only when execution is spread across 2 distributed machines; with
more machines an error occurs.
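For reference, hello_oshmem_c.c is essentially a minimal OpenSHMEM program
along these lines (again a sketch, not the exact example file):

    #include <stdio.h>
    #include <shmem.h>

    int main(int argc, char *argv[])
    {
        /* Start the OpenSHMEM runtime, then query this PE's id and the PE count */
        shmem_init();
        int me = shmem_my_pe();
        int npes = shmem_n_pes();
        printf("Hello, world, I am %d of %d\n", me, npes);
        shmem_finalize();

        return 0;
    }

A five-host run as described would be launched with something like the
following (the hosts are the IPs from the report):

    oshrun -np 5 --host 172.16.5.29,172.16.5.30,172.16.5.31,172.16.5.32,172.16.5.33 hello_oshmem_c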
> We discourage sending Word attachments on
> mailing lists. I'd suggest sending this to us as plain text if you want us
> to read it.
>
>
> > On Aug 12, 2016, at 4:03 AM, Debendra Das wrote:
> >
> > I have installed OpenMPI-2.0.0 on 5 systems with IP addresses
> > 172.16.5.29, 172.16.5.30, 172.16.5.31, 172.16.5.32, and 172.16.5.33.
> Thanks for both the report and posting the logs in a plain text file.
>
>
> I opened https://github.com/open-mpi/ompi/issues/1966 to track this issue.
>
> It contains a patch that fixes/works around it.
>
>
> Cheers,
>
>
> Gilles
>
> On 8/14/2016 7:39 PM, Debendra Das wrote:
> > If you have an InfiniBand network, another option is to install
> > mxm (a Mellanox proprietary but free library) and rebuild Open MPI.
> > pml/yalla will be used instead of ob1 and you should be just fine.
> >
> > Cheers,
> >
> > Gilles
> >
> > On Tuesday, August 16, 2016, Jeff Squyres wrote:
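For completeness, the MXM route suggested above would look roughly like
the following; this is a sketch that assumes MXM is installed in the
usual Mellanox default location, which may differ on other systems:

    ./configure --with-mxm=/opt/mellanox/mxm --prefix=$HOME/ompi-mxm
    make -j 4 all install
    mpirun --mca pml yalla -np 2 --host 172.16.5.33,172.16.5.32 hello_c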