Hi, by $HOME I meant your installation directory (the some/path you mentioned).

On Sun, Aug 2, 2009 at 12:07 AM, Prasadcse Perera <prasadc...@gmail.com> wrote:

> Hi,
> By "common bashrc" I meant the case where /home is network-mounted, so you
> can ignore that, I guess. Have you tried adding
>  . $HOME/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc  to your ~/.bashrc on the nodes?
> This will source the configuration you need from the bashrc file located
> inside that directory.
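>
> For example, a minimal sketch of that line (with /path/to as a hypothetical
> placeholder for your actual installation prefix):
>
>   # ~/.bashrc on each node: source the OpenFOAM environment at login
>   . /path/to/OpenFOAM/OpenFOAM-1.5.x/etc/bashrc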
>
> Prasad.
>
>
> On Sat, Aug 1, 2009 at 11:09 PM, Tomislav Maric <tomislav.ma...@gmx.com> wrote:
>
>> Prasadcse Perera wrote:
>> > Hi,
>> > One workaround is to define PATH and LD_LIBRARY_PATH in your common
>> > .bashrc and have matching installation paths on the two nodes. This
>> > works nicely for me with my three-node installation :).
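>> >
>> > A minimal sketch of the common .bashrc entries (with /path/to as a
>> > hypothetical placeholder for the real prefix):
>> >
>> >   # make the locally built Open MPI visible on every node
>> >   export PATH=/path/to/openmpi/bin:$PATH
>> >   export LD_LIBRARY_PATH=/path/to/openmpi/lib:$LD_LIBRARY_PATH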
>> >
>>
>> Thank you very much for the advice. Actually, I'm running OpenFOAM (read:
>> a program parallelized to run with Open MPI) from a SLAX Live DVD, so the
>> installation paths are identical, as is everything else.
>>
>> I've added commands that set environment variables in .bashrc on both
>> nodes, but you mention a "common .bashrc". Common in what way? Sorry for
>> the newbish question, again; I'm supposed to be a Mechanical Engineer.
>> :))))
>>
>> The OpenFOAM toolkit carries a separate directory for third-party support
>> software. This directory holds programs for postprocessing simulation
>> results and analyzing data, as well as Open MPI. So in my case, Open MPI
>> is built in a separate directory and the build is automated.
>>
>> After the build of both programs, there is a special bashrc located in
>>
>> some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/
>>
>> that sets all the variables needed to use OpenFOAM, such as
>> FOAM_TUTORIALS (where the tutorials are), FOAM_RUN (where the working
>> dir is), WM_COMPILER (which compiler to use), etc. This bashrc also sets
>> LD_LIBRARY_PATH and PATH so that the locally installed Open MPI can be found.
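>>
>> For illustration, a sketch of checking the environment after sourcing it
>> (some/path stays a placeholder, as above):
>>
>>   . some/path/OpenFOAM/OpenFOAM-1.5-dev/etc/bashrc
>>   echo $FOAM_TUTORIALS   # tutorials location
>>   echo $FOAM_RUN         # working directory
>>   which mpirun           # should point at the locally built Open MPI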
>>
>> I've tried this installation from the Live DVD on my laptop with two
>> cores, decomposed the case, and ran the simulation in parallel without a
>> problem.
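>>
>> (Roughly, the sequence was the usual one; icoFoam is just an example
>> solver here:)
>>
>>   decomposePar                     # split the case for 2 processors
>>   mpirun -np 2 icoFoam -parallel   # run the solver on both cores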
>>
>> I hope this information is more helpful.
>>
>> Best regards,
>> Tomislav
>>
>> _______________________________________________
>> users mailing list
>> us...@open-mpi.org
>> http://www.open-mpi.org/mailman/listinfo.cgi/users
>>
>
>
>
> --
> http://www.codeproject.com/script/Articles/MemberArticles.aspx?amid=3489381
>



-- 
http://www.codeproject.com/script/Articles/MemberArticles.aspx?amid=3489381
