[OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric
Hello everyone,

I'm trying to run an OpenFOAM simulation on two hosts on my home LAN.
I've managed to make the hosts communicate via ssh without giving
passwords, as instructed on the Open MPI web page.

The problem with running mpirun lies in the environment variables for the
non-interactive login bash on the slave node. How exactly can I tell bash
where to look for the binaries and libraries of OpenFOAM and Open MPI, or,
to be precise, how do I set PATH and LD_LIBRARY_PATH so that mpirun works
without "orted: command not found" or similar errors?

I've been reading the instructions on the web site, and I've copied the
commands that set my environment variables into all the necessary files
(.bashrc, .bash_profile ...), but nothing has changed.

My excuse is that I'm a Mechanical Engineering student. :)


Thank you in advance for your understanding and help,

Tomislav







Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Prasadcse Perera
Hi,
One workaround is to define PATH and LD_LIBRARY_PATH in your common
.bashrc and use identical installation paths on the two nodes. This
works nicely for me with my three-node installation :).
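
Something like the following in ~/.bashrc on every node should do (a
minimal sketch; the /opt/openmpi prefix is only an assumed example,
substitute your actual installation path):

export PATH=/opt/openmpi/bin:$PATH
export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH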




-- 
http://www.codeproject.com/script/Articles/MemberArticles.aspx?amid=3489381


Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric
Prasadcse Perera wrote:
> Hi,
> One workaround is to define PATH and LD_LIBRARY_PATH in your common
> .bashrc and use identical installation paths on the two nodes. This
> works nicely for me with my three-node installation :).
> 

Thank you very much for the advice. Actually, I'm running OpenFOAM (read:
a program parallelized to run with Open MPI) from a SLAX Live DVD, so the
installation paths are identical, as is everything else.

I've added the commands that set the environment variables to .bashrc on
both nodes, but you mention a "common .bashrc". Common in what way? I'm
sorry for the newbish question; again, I'm supposed to be a Mechanical
Engineer. :)

The OpenFOAM toolkit carries a separate directory for third-party support
software. This directory holds programs for postprocessing simulation
results and analyzing data, as well as Open MPI. Therefore, in my case,
Open MPI is built in a separate directory, and the build is automated.

After both programs are built, there is a special bashrc located in

some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/

that sets all the variables needed to use OpenFOAM, such as
FOAM_TUTORIALS (where the tutorials are), FOAM_RUN (where the working
dir is), WM_COMPILER (which compiler to use), etc. This bashrc also sets
LD_LIBRARY_PATH and PATH so that the locally installed Open MPI can be
found.
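
(So on each machine the idea is roughly this, with the leading path being
site-specific:

. some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/bashrc
echo $FOAM_TUTORIALS $WM_COMPILER   # sanity check that the variables are set
which mpirun                        # should point at the ThirdParty Open MPI

at least that's how I check it in an interactive shell.)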

I've tried this installation on the Live DVD on my laptop with two
cores, decomposed the case and ran the simulation in parallel without a
problem.

I hope this information is more helpful.

Best regards,
Tomislav



Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Dominik Táborský
Hi Tomislav,

I also had this issue. When you try to trace it, you'll find out that
when you manually connect to a machine and immediately execute a
command, it will inherit your environment, not the environment of the
node. See:
$ ssh node1 && echo $PATH

This will echo the PATH on your computer, the master one, not the node.
But if you do this:
$ ssh node1
node1$ echo $PATH

it will echo the PATH on your node.
The solution is to add the path to the executables and the path to the
libraries to the variables you have set on your own computer, the
master.

Let me know how that works for you!

Dr. Eddy




Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Prasadcse Perera
Hi,
By "common .bashrc" I meant the case where /home is network mounted, so
ignore that, I guess. Have you tried adding

. $HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc

to your ~/.bashrc on the nodes? This will pull in the configuration you
need from the bashrc file located inside that directory.

Prasad.



Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Prasadcse Perera
Hi, $HOME here stands for your installation directory (the some/path you
mentioned).



Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric
Dominik Táborský wrote:
> Hi Tomislav,
> 
> I also had this issue. When you try to trace it, you'll find out that
> when you manually connect to a machine and immediately execute a
> command, it will inherit your environment, not the environment of the
> node. See:
> $ ssh node1 && echo $PATH
> 
> This will echo the PATH on your computer, the master one, not the node.
> But if you do this:
> $ ssh node1
> node1$ echo $PATH
> 
> it will echo the PATH on your node.

I've tried it:

$ ssh node1 && echo $PATH

at first does nothing, leaving me logged in on the node, but when I exit,
it writes out the $PATH on the master node.

ssh node1

slax@node1$ echo $PATH

gives me the path on the slave node1,

and

slax@master$ ssh  node1 'echo $PATH'

gives me the reduced path on the slave node. I think that the problem is
exactly the same as in the last line: when I execute a bash script, it is
invoked in a non-interactive mode (login mode, because of the ssh), and
maybe some other config file is read instead of .bash_profile or
.bashrc? This reduced PATH and LD_LIBRARY_PATH prevent mpirun from
finding the right libraries and binaries.

> Solution to this is to write the path to the executables and path to
> libraries to the variables you have set on your own computer, tha
> master.

The master computer already has everything set, because the Live DVD is
configured properly (I ran a test case on the dual core; mpirun runs fine
locally). I'm not sure I understand; could you please explain a bit more?
This is all new to me.

Thank you very much for your advice and time!

Best regards,

Tomislav






Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric
Prasadcse Perera wrote:
> Hi,
> By "common .bashrc" I meant the case where /home is network mounted, so
> ignore that, I guess. Have you tried adding
>
> . $HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc
>
> to your ~/.bashrc on the nodes? This will pull in the configuration you
> need from the bashrc file located inside that directory.

I've done that and still no go. I've tried it before, when I read that
bash reads .bashrc when invoked in non-interactive mode.

I'm not at all certain I understand the way bash works anymore.

Here's what I've been trying:

1) bash runs .bashrc if invoked in non-interactive mode (--login or no
--login)

2) I've added this line to .bashrc on my master node:

   echo Hello, I'm your .bashrc file, you're running non-interactive bash

3) bash runs in non-interactive mode when you write and execute a
script, so I've written a script hello.sh:

echo Hello!!!

and tried bash hello.sh, but all I got was "Hello!!!" as output, not
"Hello, I'm your .bashrc file, you're running non-interactive bash".

Why is that?
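
(A possible missing step, judging by the bash man page: a shell started
as "bash hello.sh" is non-interactive and non-login, and such a shell
reads only the file named in $BASH_ENV, not ~/.bashrc. ~/.bashrc is read
by interactive non-login shells and, on bash builds configured for it, by
the shell that sshd starts for a remote command. A quick way to check
what kind of shell is running:

echo $-             # contains "i" only in an interactive shell
shopt login_shell   # prints "on" only in a login shell

)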


These are the commands I've tried after setting .bashrc to be word for
word the same as .bash_profile (I didn't have a .bashrc file before these
changes), and their outputs:

 slax@marija:/OpenFOAM/OpenFOAM-1.5-dev/tutorials/interFoam/mojDambreak$
/OpenFOAM/ThirdParty/openmpi-1.3/platforms/linuxGccDPOpt/bin/mpirun -H
mario -np 2 `which interFoam` -parallel
/OpenFOAM/OpenFOAM-1.5-dev/applications/bin/linuxGccDPOpt/interFoam:
error while loading shared libraries: libinterfaceProperties.so: cannot
open shared object file: No such file or directory
/OpenFOAM/OpenFOAM-1.5-dev/applications/bin/linuxGccDPOpt/interFoam:
error while loading shared libraries: libinterfaceProperties.so: cannot
open shared object file: No such file or directory
slax@marija:/OpenFOAM/OpenFOAM-1.5-dev/tutorials/interFoam/mojDambreak$

I've started mpirun with its full pathname, so that it works like the
--prefix option and carries the installation info over to the node. Not
being able to find the dynamically linked shared libraries points to an
env. variable that's missing or incorrectly set.
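
(For reference, I believe the explicit form would be something like this,
with the same paths as above:

/OpenFOAM/ThirdParty/openmpi-1.3/platforms/linuxGccDPOpt/bin/mpirun \
    --prefix /OpenFOAM/ThirdParty/openmpi-1.3/platforms/linuxGccDPOpt \
    -H mario -np 2 `which interFoam` -parallel

and, as far as I understand, --prefix only fixes up PATH and
LD_LIBRARY_PATH for Open MPI's own binaries and libraries on the remote
node, not for OpenFOAM's libraries such as libinterfaceProperties.so.)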

The `which interFoam` part is from the instructions on how to run
OpenFOAM programs using mpirun, found here:

http://www.opencfd.co.uk/openfoam/doc/userse35.html

Then I tried another approach, passing env variables with the -x
option of mpirun:

slax@marija:/OpenFOAM/OpenFOAM-1.5-dev/tutorials/interFoam/mojDambreak$
/OpenFOAM/ThirdParty/openmpi-1.3/platforms/linuxGccDPOpt/bin/mpirun -x
LD_LIBRARY_PATH=$LD_LIBRARY_PATH, -x PATH=$PATH, -H mario -np 2 `which
interFoam` -parallel
--
mpirun noticed that process rank 0 with PID 12121 on node mario exited
on signal 11 (Segmentation fault).
--
slax@marija:/OpenFOAM/OpenFOAM-1.5-dev/tutorials/interFoam/mojDambreak$

and this result gives me a new hint, but where does this hint lead to?
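
(One thing I noticed in the mpirun man page: the plain form

mpirun -x LD_LIBRARY_PATH -x PATH ...

exports the local values as they are, and with the -x VAR=value form the
trailing commas I typed above would end up inside the exported values
themselves.)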

Questions:

Why didn't the copying of .bash_profile contents to .bashrc work at all?

Why did the second type of invocation return a Segmentation Fault?

I can try sending more env. variables via the -x option, but they are
really numerous for OpenFOAM.

Any advice? As you all see, I'm really trying, and this is quite heavy
stuff for a newbish mech. engineer. :)))

Thank you Prasadcse Perera for your advice and time!

Best regards,
Tomislav




Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Dominik Táborský
Hi,

This is it:
> slax@master$ ssh  node1 'echo $PATH'
> 
> gives me the reduced path on the slave node.

I'm sorry, I was wrong. You typed it correctly. AFAIK, this command logs
in to your node, but the PATH variable is still just as on your master. I
had this issue, and I solved it by editing the .bashrc file on the
master, NOT the node. That worked for me. Try editing PATH and
LD_LIBRARY_PATH on the master, on the computer where you run the mpirun
command.

So, for example, if you have the MPI installation on the nodes in
/openMPI/, with subfolders "bin" and "lib", try putting these lines
into your .bashrc file on the master:
export PATH=$PATH:/openMPI/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/openMPI/lib

It shouldn't matter where your MPI installation is on the master. The
nodes are what matter!

Note: I am an Open MPI beginner; I am not involved in development. I'm
just sharing my experience with the same problem and how I solved it. No
guarantee...

Dr. Eddy


Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric

I'm really grateful for your help!

I tried leaving only .bashrc on the master node, and I set the variables
as you suggested, but nothing changed.

I've read again in the man pages about ssh-invoked bash, and it really
does read and execute the /etc/bash.bashrc file and ~/.bashrc, whichever
comes first. I've added echo commands to .bashrc on the master, but
nothing is echoed, and

ssh node1 'echo $PATH'

gives the reduced path again. I'm frustrated. I'm a step away from
running OpenFOAM on a LAN off a Live DVD... :(




Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Dominik Táborský

The .bashrc on your master is not run; therefore there are no echoes.
Let's revise once more so that we're sure we understand each other :-)

On your master - on the computer you run "mpirun" - you put these 2
lines into your own .bashrc:

export PATH=$PATH:/openMPI/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/openMPI/lib

These 2 lines are in /home/tomislav/.bashrc (supposing your user is
tomislav).

On each of your nodes you can leave the .bashrc files. Try putting this
line in them:
echo $PATH

Again, this file is /home/tomislav/.bashrc. It must be the same file. Or
if you don't have that file on your nodes, the system-wide bashrc file
should be run, which is - I think - /etc/bash.bashrc.

Now, when you run the command:
ssh node1 echo $PATH

It should print two lines:
the 1st should be from the executed command, printing the new PATH;
the 2nd should come from the echo in the node's .bashrc file, printing
the reduced PATH.

Anyway, let's try something:
$ echo $PATH
$ ssh node1 echo $PATH
node1$ echo $PATH

This should print out 3 lines, your master PATH twice and then the
reduced PATH once.

Let me know how that went.

Dr. Eddy





Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric
Dominik Táborský wrote:
> Anyway, let's try something:
> $ echo $PATH
> $ ssh node1 echo $PATH
> node1$ echo $PATH
>
> This should print out 3 lines, your master PATH twice and then the
> reduced PATH once.

Are you sure this is the right syntax:

ssh node1 echo $PATH

?

ssh node1 echo $PATH gives me only the full PATH and doesn't log me on
to node1.

OK, here's what I've tried:

ssh node1 && echo $PATH

gives me first the $PATH on the master, and then the reduced one, as you
said.


echo $PATH
ssh node1 && echo $PATH

gives me the master $PATH twice, and then the reduced one.

At least now I know that ~/.bashrc is not being called at all. You see,
I've also tried

ssh node1

and if ssh is using a non-interactive login bash, it should source a
.bashrc. If it sourced the .bashrc on the master, it would, because of
the

echo $PATH

echo the master $PATH, and if it sourced the .bashrc on the slave node,
it would echo the reduced path. Something is not right. I've even
written "echo master node" and "echo slave node" in both .bashrc files,
and nothing was echoed.

If I try

ssh mario 'echo $PATH'

I still get the reduced $PATH.

One more thing to know: I set ALL environment variables using OpenFOAM's
bashrc script, both for OpenFOAM and OMPI, but this shouldn't make any
difference, because the

. $FOAM_INST_DIR/OpenFOAM-1.5-dev/etc/bashrc

script sets all the variables the right way on both the master AND the
slave node, meaning that I can run successful serial simulations on both
computers. :)

Any suggestions?




Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Dominik Táborský

Okay, now it's getting more confusing since I just found out that it
somehow stopped working for me!

Anyway, let's find a solution.
I found out that there is a difference between

ssh node1 echo $PATH

and

ssh node1 'echo $PATH'

These commands give you different output. 'man ssh' states that it sets
its own PATH variable right before the user logs in, but after the
connection is established. This variable is set at compile time.

I am using dropbear as the SSH2 server, so I won't be able to guide you
if you're using OpenSSH as a server, but the man pages should be
sufficient. Look into man ssh and man ssh_config. You should create the
file ~/.ssh/rc OR ~/.ssh/config OR ~/.ssh/environment. Setting it up in
one of these files should be enough.
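
For OpenSSH, a minimal sketch (the /openMPI paths are only an example,
use yours):

# on each node, in ~/.ssh/environment:
PATH=/openMPI/bin:/usr/bin:/bin
LD_LIBRARY_PATH=/openMPI/lib

# and in each node's /etc/ssh/sshd_config (then restart sshd):
PermitUserEnvironment yes

~/.ssh/environment takes plain VAR=value lines, and sshd only reads it
when PermitUserEnvironment is enabled in sshd_config.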

I will probably recompile dropbear with a new default PATH.

Anyway, I am sure it used to work for me and I have no idea why it
stopped.

If you'd need more help, just ask :-)

Dr. Eddy







Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Dorian Krause

Hi,

Dominik Táborský wrote:

> I found out that there is a difference between
> ssh node1 echo $PATH

In this case the $PATH variable is expanded by the shell *before* the
ssh executable is called.

> and
> ssh node1 'echo $PATH'

Here $PATH is expanded on node1.
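
You can see the same effect by escaping the dollar sign instead of
quoting:

ssh node1 echo \$PATH

which defers the expansion to the remote shell, just like the quoted
form.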






Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Tomislav Maric

Thank you very much!! I'm also finding out about those files, and I'm
using OpenSSH. I'll try to configure it to work. The weirdest thing is
that people who use Ubuntu, on the OpenFOAM forum, just had to comment
out a line in .bashrc that returns if bash is run in non-interactive
mode.
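
(If I read those forum posts right, the stock Ubuntu ~/.bashrc begins
with an early return for non-interactive shells, something like

[ -z "$PS1" ] && return

so any exports have to be placed above that line to reach a
non-interactively started bash.)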

I just don't get it. Let me ask you just one thing, before the next 5-6
hours of fighting with config files:

What about NFS?

What if I export the directory? The OMPI pages say that NFS simplifies
things. I'm a networking noob, so I don't know whether this would
benefit me.

If I edit ~/.ssh/environment, then I have to set VARIABLE=VALUE pairs
manually, and there are dozens of variables to set. I think I'll try the
rc file first.

Thank you again!

Best regards,
Tomislav


Re: [OMPI users] Open MPI and env. variables (LD_LIBRARY_PATH and PATH) - complete and utter Open MPI / Linux noob

2009-08-02 Thread Dominik Táborský
I'm sorry, I can't help you with NFS. I have never had it on my network.

Good luck anyway... :)

