Hi,

> Not sure I understand the problem here - are you saying that
> the proc ran, but then mpiexec hangs instead of exiting?

No, mpiexec doesn't hang. It terminates without producing any output (the dbx session below shows exit code 1).


tyr small_prog 60 /opt/solstudio12.3/bin/sparcv9/dbx \
  /usr/local/openmpi-1.9_64_cc/bin/mpiexec
For information about new features see `help changes'
To remove this message, put `dbxenv suppress_startup_message 7.9' in your .dbxrc
Reading mpiexec
Reading ld.so.1
Reading libopen-rte.so.0.0.0
...
(dbx) run -np 2 -host tyr,rs0 rank_size
Running: mpiexec -np 2 -host tyr,rs0 rank_size 
(process id 12341)
Reading libc_psr.so.1
Reading mca_shmem_mmap.so
...
Reading mca_dfs_app.so
Reading mca_dfs_orted.so
Reading mca_dfs_test.so

execution completed, exit code is 1
(dbx) 
(dbx) 
(dbx) 
(dbx) check -all
access checking - ON
memuse checking - ON
(dbx) run -np 2 -host tyr,rs0 rank_size
Running: mpiexec -np 2 -host tyr,rs0 rank_size 
(process id 12346)
Reading rtcapihook.so
Reading libdl.so.1
...
RTC: Running program...
Read from uninitialized (rui) on thread 1:
Attempting to read 1 byte at address 0xffffffff7fffbfcb
    which is 459 bytes above the current stack pointer
Variable is 'cwd'
t@1 (l@1) stopped in opal_getcwd at line 65 in file "opal_getcwd.c"
   65           if (0 != strcmp(pwd, cwd)) {
(dbx)
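
For context, the statement dbx stops in compares the result of getcwd(3) against another path string via strcmp(). The following is only a sketch of that pattern, not the actual opal_getcwd.c code; the assumption that the other string comes from the PWD environment variable is mine:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    /* getcwd() writes only the path plus the terminating NUL;
       the rest of this stack buffer stays uninitialized */
    char cwd[1024];
    /* assumption: the other operand of the strcmp() in opal_getcwd.c */
    const char *pwd = getenv("PWD");

    if (NULL == getcwd(cwd, sizeof(cwd))) {
        perror("getcwd");
        return EXIT_FAILURE;
    }

    /* comparison of the same shape as the one at opal_getcwd.c:65 */
    if (NULL != pwd && 0 != strcmp(pwd, cwd)) {
        printf("PWD (%s) and getcwd() (%s) differ\n", pwd, cwd);
    }
    return EXIT_SUCCESS;
}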



Kind regards

Siegmar



> On Jan 1, 2014, at 1:48 AM, Siegmar Gross <siegmar.gr...@informatik.hs-fulda.de> wrote:
> 
> > Nevertheless I have another problem with my small program.
> > 
> > tyr small_prog 158 uname -p
> > sparc
> > tyr small_prog 159 ssh rs0 uname -p
> > sparc
> > 
> > tyr small_prog 160 mpiexec rank_size
> > I'm process 0 of 1 available processes running on tyr.informatik.hs-fulda.de.
> > MPI standard 2.2 is supported.
> > 
> > tyr small_prog 161 ssh rs0 mpiexec rank_size
> > I'm process 0 of 1 available processes running on rs0.informatik.hs-fulda.de.
> > MPI standard 2.2 is supported.
> > 
> > tyr small_prog 162 mpiexec -np 2 -host tyr,rs0 rank_size
> > tyr small_prog 163 echo $status
> > 1
> > tyr small_prog 164 
> > 
> > 
> > The command works as expected on little endian machines.
> > 
> > linpc1 small_prog 93 mpiexec -np 2 -host linpc1,sunpc1 rank_size
> > I'm process 0 of 2 available processes running on linpc1.
> > MPI standard 2.2 is supported.
> > I'm process 1 of 2 available processes running on sunpc1.
> > MPI standard 2.2 is supported.
> > linpc1 small_prog 94 
> 
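
For reference, judging only from the quoted output, rank_size reports its rank, the communicator size, the host name and the supported MPI version. A minimal stand-in written against the standard MPI C API (not the original source) could look like this:

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, size, version, subversion, len;
    char host[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Get_processor_name(host, &len);
    MPI_Get_version(&version, &subversion);

    /* matches the two output lines shown in the quoted runs */
    printf("I'm process %d of %d available processes running on %s.\n"
           "MPI standard %d.%d is supported.\n",
           rank, size, host, version, subversion);

    MPI_Finalize();
    return 0;
}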
