Andreas Schäfer wrote:
On 19:31 Tue 08 Jan, Dino Rossegger wrote:
Hi,
thanks for the program, but sadly I can't get it to work :(.
It's the same error as in my program. I get the following output:
0
0
0
10
0
0
11
0
0
Which, as far as I know, can't be correct.
Oh, my
right data in
its array, all the others only 0.
Andreas Schäfer wrote:
Hi Dino,
On 18:05 Tue 08 Jan, Dino Rossegger wrote:
In fact it is initialized; as I stated in my first mail, I only left out
the code where it gets initialized, since it reads the data from a file
and tha
George Bosilca wrote:
On Jan 8, 2008, at 11:14 AM, Dino Rossegger wrote:
If so, then the problem is that Scatter actually gets an array of
pointers and sends these pointers, trying to interpret them as doubles.
You either have to use several scatter commands or "fold" your
2D-Array
I'll try the folding,
maybe this will help.
Thanks
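(A minimal sketch of the "folding" George describes, assuming a 3x3 array
of doubles scattered row-by-row to 3 processes; the dimensions, names and
dummy values are illustrative, not taken from the original code:)

#include <mpi.h>
#include <vector>
#include <iostream>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int ROWS = 3, COLS = 3;          // assumed dimensions
    std::vector<double> flat(ROWS * COLS); // one contiguous block ("folded" 2D array)

    if (rank == 0) {
        // The root fills flat[i * COLS + j] instead of data[i][j];
        // dummy values stand in for the data read from the file.
        for (int i = 0; i < ROWS; ++i)
            for (int j = 0; j < COLS; ++j)
                flat[i * COLS + j] = i * 10.0 + j;
    }

    // Each of the 3 ranks receives one row of COLS doubles.
    std::vector<double> row(COLS);
    MPI_Scatter(flat.data(), COLS, MPI_DOUBLE,
                row.data(),  COLS, MPI_DOUBLE,
                0, MPI_COMM_WORLD);

    for (int j = 0; j < COLS; ++j)
        std::cout << rank << ": " << row[j] << std::endl;

    MPI_Finalize();
    return 0;
}

Because the doubles now sit in one contiguous buffer, MPI_Scatter sends
the values themselves rather than pointer addresses.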
Hope this helps
Jody
On Jan 8, 2008 3:54 PM, Dino Rossegger wrote:
Hi,
I have a problem distributing a 2 dimensional array over 3 processes.
I tried different methods to distribute the data (Broadcast, Send/Recv,
Scatter) but all of them did
Hi,
I have a problem distributing a 2 dimensional array over 3 processes.
I tried different methods to distribute the data (Broadcast, Send/Recv,
Scatter), but all of them didn't work for me. The output of the root
process (0 in my case) is always okay; the output of the others is
simply 0.
T
st be sure to get
> rid of all of the old installs of Open MPI from all your nodes, then
> reinstall and try again.
>
> Tim
>
> Dino Rossegger wrote:
>> Here is the syntax & output of the command:
>> root@sun:~# mpirun --hostfile hostfile saturn
>> [s
> Can you try:
> mpirun --hostfile hostfile hostname
>
> Thanks,
>
> Tim
>
> Dino Rossegger wrote:
>> Hi again,
>>
>> Tim Prins wrote:
>>> Hi,
>>>
>>> On Monday 01 October 2007 03:56:16 pm Dino Rossegger wrote:
>>
Hi again,
Tim Prins wrote:
> Hi,
>
> On Monday 01 October 2007 03:56:16 pm Dino Rossegger wrote:
>> Hi again,
>>
>> Yes, the error output is the same:
>> root@sun:~# mpirun --hostfile hostfile main
>> [sun:23748] [0,0,0] ORTE_ERROR_LOG: Timeout in file
the --prefix option:
>
> $mpirun -np 2 --prefix /opt/openmpi -H sun,saturn ./main
>
> (assuming your Open MPI installation lies in /opt/openmpi
> on both machines)
>
>
> Jody
>
> On 10/1/07, Dino Rossegger wrote:
>> Hi Jody,
>> I did the steps as you said, but
nvironment on the client and setting
> PermitUserEnvironment yes
> in /etc/ssh/sshd_config on the server (for this you need root
> privilege though)
>
> To be on the safe side, i did both on all my nodes
>
> Jody
>
> On 9/27/07, Dino Rossegger wrote:
>> Hi Jody,
>
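(A sketch of what that setup can look like; the paths below assume the
/opt/openmpi prefix mentioned earlier in the thread and are not copied
from the original mails:)

# on each client node, in ~/.ssh/environment
PATH=/opt/openmpi/bin:/usr/bin:/bin
LD_LIBRARY_PATH=/opt/openmpi/lib

# on each server node, in /etc/ssh/sshd_config (needs root)
PermitUserEnvironment yes

# then restart sshd, e.g.
/etc/init.d/ssh restart

With PermitUserEnvironment enabled, the non-interactive ssh sessions that
mpirun uses to start its remote daemons pick up these variables.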
vironment variables are when
> ssh is run without a shell.
>
>
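(A quick way to check this, not from the original mail; saturn is one of
the node names used in this thread:)

ssh saturn env | grep -E 'PATH|LD_LIBRARY_PATH'

If the Open MPI bin and lib directories do not show up in this output, the
remote daemon started by mpirun will not find its binaries or libraries.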
> On 9/27/07, Dino Rossegger wrote:
>> Hi,
>>
>> I have a problem running a simple program, mpihello.cpp.
>>
>> Here is an excerpt of the error and the command:
>> root@sun:~# mpirun -H
Hi,
I have a problem running a simple program, mpihello.cpp.
Here is an excerpt of the error and the command:
root@sun:~# mpirun -H sun,saturn main
[sun:25213] [0,0,0] ORTE_ERROR_LOG: Timeout in file
base/pls_base_orted_cmds.c at line 275
[sun:25213] [0,0,0] ORTE_ERROR_LOG: Timeout in file pls_rsh_m