Hi all,
I've inherited an MPI code that was written ~8-10 years ago, and it predominantly
uses MPI persistent communication routines for data transfers, e.g.
MPI_SEND_INIT, MPI_RECV_INIT, MPI_START, etc. I was just wondering whether
persistent communication calls are still the most efficient/scalable approach?
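For context, persistent requests are still part of the MPI standard and fully
supported; the idea is to pay the request-setup cost once and then re-arm the
same requests every iteration. A minimal sketch of the pattern in C (the ring
exchange, buffer sizes, and iteration count are my own invented example, not
taken from the code in question):

#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double sendbuf[1024], recvbuf[1024];
    int right = (rank + 1) % size;
    int left  = (rank + size - 1) % size;
    MPI_Request reqs[2];

    /* Create the persistent requests once, outside the time-step loop. */
    MPI_Send_init(sendbuf, 1024, MPI_DOUBLE, right, 0, MPI_COMM_WORLD, &reqs[0]);
    MPI_Recv_init(recvbuf, 1024, MPI_DOUBLE, left,  0, MPI_COMM_WORLD, &reqs[1]);

    for (int iter = 0; iter < 100; ++iter) {
        /* ... fill sendbuf ... */
        MPI_Startall(2, reqs);                      /* re-arm both requests */
        MPI_Waitall(2, reqs, MPI_STATUSES_IGNORE);
        /* ... use recvbuf ... */
    }

    MPI_Request_free(&reqs[0]);
    MPI_Request_free(&reqs[1]);
    MPI_Finalize();
    return 0;
}

Whether this is still the fastest option depends on the implementation; the
safest answer is to benchmark it against plain MPI_Isend/MPI_Irecv on your
target system.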
Another option is to leave iptables enabled, but allow TCP sockets with random
source/destination port numbers between trusted machines.
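To illustrate why fixed, port-based rules are awkward here: the source port of
an outgoing TCP connection is chosen by the kernel from the ephemeral range and
differs on every run. A self-contained C sketch (the target address below is a
placeholder of my own; any reachable TCP service shows the same effect):

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(void)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in peer;
    memset(&peer, 0, sizeof(peer));
    peer.sin_family = AF_INET;
    peer.sin_port   = htons(80);               /* placeholder service port */
    inet_pton(AF_INET, "93.184.216.34", &peer.sin_addr);

    if (connect(fd, (struct sockaddr *)&peer, sizeof(peer)) == 0) {
        struct sockaddr_in local;
        socklen_t len = sizeof(local);
        getsockname(fd, (struct sockaddr *)&local, &len);
        /* This port is different on every run of the program. */
        printf("kernel-chosen source port: %d\n", ntohs(local.sin_port));
    }
    close(fd);
    return 0;
}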
On Mar 25, 2013, at 10:21 AM, Ralph Castain wrote:
>
> On Mar 25, 2013, at 7:13 AM, Baptiste Robert wrote:
>
>> Yes, I read that we have no choice: we have to disable iptables. [...]
On Mar 25, 2013, at 7:13 AM, Baptiste Robert wrote:
> Yes, I read that we have no choice: we have to disable iptables. This
> information is not crystal clear in the user manual.
> Furthermore, this means we can't launch MPI securely on a remote web server.
Well, you could [...]
Yes, I read that we have no choice: we have to disable iptables.
This information is not crystal clear in the user manual.
Furthermore, this means we can't launch MPI securely on a remote web server.
Thank you for your help, I really appreciate it.
2013/3/25 Ralph Castain
On Mar 25, 2013, at 3:26 AM, Baptiste Robert wrote:
> Hi.
>
> Thank you very much for your answer. I've disabled iptables on both computers,
> and then... it worked like a charm. But here comes my next question: which
> ports does the daemon use? Since I haven't configured iptables, it's at its
> defaults, and I don't understand why the traffic is filtered.
Hello,
We observe the following divide-by-zero error:
[linuxscc005:31416] *** Process received signal ***
[linuxscc005:31416] Signal: Floating point exception (8)
[linuxscc005:31416] Signal code: Integer divide-by-zero (1)
[linuxscc005:31416] Failing at address: 0x2282db
[linuxscc005:31416] [ 0]
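For reference, the "Floating point exception (8)" / "Integer divide-by-zero (1)"
pair is SIGFPE with si_code FPE_INTDIV, i.e. an integer (not floating-point)
division by zero. A minimal standalone C sketch (mine, not derived from the
failing application) that reproduces and decodes the same trap:

#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

static void handler(int sig, siginfo_t *info, void *ctx)
{
    (void)ctx;
    /* On Linux/x86, sig is 8 (SIGFPE) and si_code is 1 (FPE_INTDIV),
       matching the numbers in the log above. */
    fprintf(stderr, "signal %d, code %d, address %p\n",
            sig, info->si_code, info->si_addr);
    _exit(1);                 /* returning from the handler would re-trap */
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof(sa));
    sa.sa_sigaction = handler;
    sa.sa_flags = SA_SIGINFO;
    sigaction(SIGFPE, &sa, NULL);

    volatile int zero = 0;    /* volatile keeps the division at run time */
    volatile int x = 1 / zero;
    (void)x;
    return 0;
}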
Hi.
Thank you very much for your answer. I've disabled iptables on both
computers, and then... it worked like a charm. But here comes my next
question: which ports does the daemon use? Since I haven't configured
iptables, it's at its defaults, and I don't understand why the traffic is
filtered.
2013/3/25 Ralph Castain
Hi,
I reported the following error for openmpi-1.9r28203 and did some research
in the meantime.
> Solaris 10, x86_64 and sparc, SunC 5.12, 32-bit and 64-bit
> --
>
> sunpc1 openmpi-1.9-SunOS.x86_64.32_cc 16 tail -15 log.make.SunOS.x86_64.32_