Hi Tomislav,

I also had this issue. When you try to trace it, you'll find out that when you connect to a machine and execute a command on the same line, the command inherits your environment, not the environment of the node. See:

$ ssh node1 echo $PATH

This will echo the PATH on your computer, the master, not the node: your local shell expands $PATH before ssh even runs, and even a quoted command runs in a non-interactive remote shell that does not read the usual login files. But if you log in first and then run the command:

$ ssh node1
node1$ echo $PATH

it will echo the PATH on your node. The solution is to set the same variables on the nodes, i.e. write the path to the executables and the path to the libraries into PATH and LD_LIBRARY_PATH there, just as you have them set on your own computer, the master.
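For example, something like this near the top of ~/.bashrc on each node should do it (a minimal sketch; /opt/openmpi is only a stand-in for wherever your Open MPI actually lives, and if your .bashrc returns early for non-interactive shells, these lines have to come before that check):

# stand-in install prefix, replace with your real Open MPI location
export PATH=/opt/openmpi/bin:$PATH
export LD_LIBRARY_PATH=/opt/openmpi/lib:$LD_LIBRARY_PATH

You can then check what a non-interactive ssh run actually sees with:

$ ssh node1 'echo $PATH'
$ ssh node1 which mpirun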
Let me know how that works for you!

Dr. Eddy

Tomislav Maric wrote on Sun 02. 08. 2009 at 13:09 +0200:
> Prasadcse Perera wrote:
> > Hi,
> > One workaround is to define PATH and LD_LIBRARY_PATH in your common
> > .bashrc and have matching installation paths on the two nodes. This
> > works nicely for me with my three-node installation :).
>
> Thank you very much for the advice. Actually I'm running OpenFOAM (read:
> a program parallelized to run with Open MPI) from a SLAX Live DVD, so the
> installation paths are identical, as is everything else.
>
> I've added commands that set environment variables in .bashrc on both
> nodes, but you mention a "common .bashrc". Common in what way? I'm sorry
> for the newbish question, again, I'm supposed to be a Mechanical Engineer.
> :))))
>
> The OpenFOAM toolkit carries a separate directory for third-party support
> software. This directory holds programs for postprocessing and analyzing
> simulation results, as well as Open MPI. Therefore, in my case, Open MPI
> is built in a separate directory and the build is automated.
>
> After the build of both programs, there is a special bashrc located in
>
> some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/
>
> that sets all the variables needed to use OpenFOAM, such as
> FOAM_TUTORIALS (where the tutorials are), FOAM_RUN (where the working
> dir is), WM_COMPILER (which compiler to use), etc. This bashrc also sets
> LD_LIBRARY_PATH and PATH so that the locally installed Open MPI can be found.
>
> I've tried this installation on the Live DVD on my laptop with two
> cores, decomposed the case, and ran the simulation in parallel without a
> problem.
>
> I hope this information is more helpful.
>
> Best regards,
> Tomislav
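Regarding the "common .bashrc" question in the quoted message: common simply means that every node's ~/.bashrc sets up the same environment. Since the OpenFOAM-supplied bashrc already exports PATH and LD_LIBRARY_PATH, a minimal sketch, assuming identical installs on all nodes and the placeholder path from the message above, is to source it from ~/.bashrc on each node:

# in ~/.bashrc on every node; the path is the placeholder from the message above
source some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/bashrc

With that in place, a non-interactive "ssh nodeX <command>" of the kind mpirun uses to start remote processes should see the same PATH and LD_LIBRARY_PATH as your interactive shell.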