On Thursday 15 June 2006 16:08, Brock Palen wrote:
> Jeez, I really can't read this morning: you are using Torque and the
> mpiexec is the one that comes with Open MPI. I can't help you, then;
> someone else is going to have to. Sorry
Would it be much of a hassle to run a very simple MPI job (maybe even in an i
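Something like this would be plenty (just a sketch of the kind of trivial
program I mean; any MPI hello-world will do):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        MPI_Init(&argc, &argv);
        /* report this process's rank and the total number of ranks */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        printf("Hello from rank %d of %d\n", rank, size);
        MPI_Finalize();
        return 0;
    }

Compiled with mpicc and launched with mpirun -np 2, every rank should
print one line.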
On Thursday 15 June 2006 16:05, Brock Palen wrote:
> I don't know about ompi-1.0.3 snapshots, but we use ompi-1.0.2 with
> both torque-2.0.0p8 and torque-2.1.0p0 using the TM interface without
> any problems.
Hm, I just went for the 1.0.3 snapshots because I couldn't get it to work with
1.0.2, so I
Jeez, I really can't read this morning: you are using Torque and the
mpiexec is the one that comes with Open MPI. I can't help you, then;
someone else is going to have to. Sorry
Brock Palen
Center for Advanced Computing
bro...@umich.edu
(734)936-1985
On Jun 15, 2006, at 9:42 AM, Martin Schafföner wrote:
I don't know about ompi-1.0.3 snapshots, but we use ompi-1.0.2 with
both torque-2.0.0p8 and torque-2.1.0p0 using the TM interface without
any problems.
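For what it's worth, all we do is point configure at the Torque install
when building Open MPI (the paths below are just examples, adjust for
your site):

    ./configure --prefix=/opt/openmpi --with-tm=/usr/local/torque
    make all install

Afterwards, ompi_info | grep tm should list the tm components; if it
doesn't, the build didn't pick up the TM libraries.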
Are you using PBSPro? OpenPBS?
As for your mpiexec: is that the one included with Open MPI (just a
symlink to orterun) or the one from
htt
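(A quick way to tell which one you have:

    ls -l `which mpiexec`

If it's a symlink pointing at orterun, it's Open MPI's own launcher; if
it's a standalone binary, it's a separate mpiexec package.)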
Hi,
I have been trying to set up Open MPI 1.0.3a1r10374 on our cluster and was
only partly successful: installation worked, and compiling a simple example
and running it through the rsh pls also worked. However, I'm the only
user who has rsh access to the nodes; all other users must go through Torque.
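What I'd expect those users to run is an ordinary Torque job script along
these lines (node and process counts here are made up):

    #!/bin/sh
    #PBS -l nodes=2:ppn=2
    cd $PBS_O_WORKDIR
    mpirun -np 4 ./hello

submitted with qsub. With Open MPI built against TM, mpirun inside the
job should start the remote ranks through Torque's TM interface, so no
rsh access to the nodes would be needed.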