Lee Amy wrote:
> Hi,
>
> I run some programs using Open MPI 1.3.3, and when I execute the
> command I encounter the following error messages.
>
> sh: orted: command not found
> --
> A daemon (pid 6797) died unexpectedly w
> You can get mpirun to use the --prefix behavior by
> default if you configure/build Open MPI with the --enable-mpirun-
> prefix-by-default configure switch.
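The "orted: command not found" error usually means the Open MPI binaries are not on the PATH of the non-interactive shell that ssh starts on the remote node. A sketch of the two fixes mentioned above (the /opt/openmpi prefix here is a placeholder; substitute your actual install directory):

```shell
# Option 1: tell mpirun where Open MPI lives on the remote nodes
# at launch time (prefix path is hypothetical).
mpirun --prefix /opt/openmpi -np 4 -host node1,node2 ./my_program

# Option 2: bake the prefix in when building Open MPI, so every
# mpirun behaves as if --prefix had been given.
./configure --prefix=/opt/openmpi --enable-mpirun-prefix-by-default
make all install
```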
>
> Hope that helps!
>
>
>
> On Aug 3, 2009, at 7:35 AM, Tomislav Maric wrote:
>
>> Thank you Dominik for all your
David Doria wrote:
> I have three machines: mine (daviddoria) and two identical remote
> machines (cloud3 and cloud6). I can password-less ssh between any pair.
> The machines are all 32bit running Fedora 11. OpenMPI was installed
> identically on each. The .bashrc is identical on each. /etc/hosts
ronment
edit sshd_config on the other host and set the same things.
Works like a charm!
Tomislav
Dominik Táborský wrote:
> I'm sorry, I can't help you with NFS. I have never had it on my network.
>
> Good luck anyway... :)
>
>
> Tomislav Maric wrote on Sun 02. 08. 2009
Dominik Táborský wrote:
> Okay, now it's getting more confusing since I just found out that it
> somehow stopped working for me!
>
> Anyway, let's find a solution.
> I found out that there is a difference between
> ssh node1 echo $PATH
> and
> ssh node1 'echo $PATH'
> These commands give you different results.
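The difference is where $PATH gets expanded. Unquoted, the local shell expands it before ssh ever runs, so the remote node just echoes your local PATH back; single-quoted, the literal string `echo $PATH` reaches the remote shell and is expanded there. A local sketch of the same effect, using `sh -c` as a stand-in for the remote shell and a throwaway DEMO variable:

```shell
# DEMO plays the role of PATH; "local" is our machine's value.
DEMO=local

# Double quotes: the *current* shell expands $DEMO first, so the
# child shell receives the literal command "echo local".
env DEMO=remote sh -c "echo $DEMO"    # prints: local

# Single quotes: $DEMO survives untouched and is expanded by the
# *child* shell, which sees DEMO=remote in its environment.
env DEMO=remote sh -c 'echo $DEMO'    # prints: remote
```

The same reasoning applies verbatim to `ssh node1 echo $PATH` versus `ssh node1 'echo $PATH'`.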
Dominik Táborský wrote:
> The .bashrc on your master is not run, therefore there are no echos.
> Let's revise once more so that we're sure we understand each other :-)
>
> On your master - on the computer you run "mpirun" - you put these 2
> lines into your own .bashrc:
>
> export PATH=$PATH:/op
Dominik Táborský wrote:
> Hi,
>
> This is it:
>> slax@master$ ssh node1 'echo $PATH'
>>
>> gives me the reduced path on the slave node.
>
> I'm sorry, I was wrong. You typed it correctly. AFAIK, this command logs
> in to your node, but the PATH variable is still the same as on your master. I
> had this
Prasadcse Perera wrote:
> Hi,
> Common .bashrc meant the case where /home is network-mounted, so ignore
> that, I guess. Have you tried adding
> . $HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc to your ~/.bashrc on the nodes?
> This will append the configuration you need from the bashrc file
> located inside the d
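A minimal sketch of that suggestion, using the OpenFOAM path quoted above (adjust the version and location to match your install):

```shell
# Append to ~/.bashrc on every node: source the OpenFOAM environment
# file so that shells started by ssh also pick up its settings.
. $HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc
```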
l core - mpirun runs fine
locally). I'm not sure I understand; could you please explain a bit more?
This is all new to me.
Thank you very much for your advice and time!
Best regards,
Tomislav
>
> Let me know how that works for you!
>
> Dr. Eddy
>
>
> Tomislav Ma
Prasadcse Perera wrote:
> Hi,
> One workaround is to define PATH and LD_LIBRARY_PATH in your common
> .bashrc and have matching installation paths on the two nodes. This
> works nicely for me with my three-node installation :).
>
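A sketch of that workaround, assuming Open MPI is installed under the same (hypothetical) prefix on every node:

```shell
# In the shared (or identical) ~/.bashrc on each node.
# /opt/openmpi is a placeholder; the prefix must match on all nodes.
export PATH=$PATH:/opt/openmpi/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/openmpi/lib
```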
Thank you very much for the advice. Actually I'm running O
Hello everyone,
I'm trying to run an OpenFOAM simulation on two hosts on my home LAN.
I've managed to make the hosts communicate via ssh without giving
passwords, as instructed on the Open MPI web page.
The problem with running mpirun is in the environment variables for the
non-interactive login bash
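One common cause worth checking: many distributions ship a ~/.bashrc that returns early for non-interactive shells, so any exports placed after that guard are never seen by `ssh node command` (and therefore not by orted). A sketch of the guard to look for:

```shell
# Typical guard near the top of ~/.bashrc (Debian/Fedora style).
# Any PATH or LD_LIBRARY_PATH exports needed by non-interactive
# ssh commands must come BEFORE this block.
case $- in
    *i*) ;;      # interactive shell: keep sourcing the file
    *) return;;  # non-interactive shell: stop here
esac
```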