ur issue neither with the latest 1.8 nor with the
> trunk. I tried using a single host, while forcing SM and then TCP, to no
> avail.
>
> Can you try restricting the collective modules in use by adding --mca coll
> tuned,basic to your mpirun command?
>
> George.
>
>
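George's suggestion above can be sketched as a command line. The program name and process count below are placeholders, not taken from the original report:

```shell
# Restrict Open MPI's collective component selection to the "tuned" and
# "basic" implementations, as suggested above.
# ./my_mpi_app and -np 4 are placeholders for the poster's actual
# program and process count.
mpirun -np 4 --mca coll tuned,basic ./my_mpi_app
```

If the hang disappears with this restriction, it points at one of the excluded collective components rather than at the application code.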
communicating between processes via networking.) If it's on a single
> host, then it might be an issue with shared memory.
>
> Josh
>
> On Fri, Feb 20, 2015 at 1:51 AM, Sachin Krishnan
> wrote:
>
>> Hello Josh,
>>
>> The command I use to compile the code
MCA vprotocol: pessimist (MCA v2.0, API v2.0, Component v1.8.4)
Sachin
> Sachin,
> Can you please provide a command line? Additional information about your
> system would also be helpful.
> Josh
>> On Wed, Feb 18, 2015 at 3:43 AM, Sachin Krishnan
>> wrote:
Hello,
I am new to MPI and also to this list.
I wrote an MPI code with several MPI_Bcast calls in a loop. My code was
getting stuck at random points, i.e., it was not systematic. After a few
hours of debugging and googling, I found that the issue may be with the
several MPI_Bcast calls in the loop.
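The pattern described above can be sketched as follows. This is a minimal illustration, not Sachin's actual program; the buffer size and iteration count are made up. The key requirement is that every rank executes the same sequence of MPI_Bcast calls with a matching root, count, and datatype, otherwise the collective can deadlock:

```c
/* Minimal sketch of repeated MPI_Bcast calls in a loop.
 * Compile with: mpicc bcast_loop.c -o bcast_loop
 * Run with:     mpirun -np 4 ./bcast_loop
 */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double buf[4] = {0};
    for (int iter = 0; iter < 1000; iter++) {
        if (rank == 0) {
            /* Only the root fills the buffer before each broadcast. */
            for (int i = 0; i < 4; i++)
                buf[i] = iter + i;
        }
        /* Collective call: every rank must reach this each iteration
         * with the same root (0), count (4), and datatype. */
        MPI_Bcast(buf, 4, MPI_DOUBLE, 0, MPI_COMM_WORLD);
    }

    if (rank == 0)
        printf("completed 1000 broadcasts on %d rank(s)\n", size);

    MPI_Finalize();
    return 0;
}
```

MPI_Bcast itself is not required to synchronize the ranks, so a correct program of this shape can still expose hangs in a particular collective implementation or transport, which is why restricting the coll modules is a useful diagnostic.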
I stu