Re: [OMPI users] [gridengine users] h_vmem in jobs with mixture of openmpi and openmp

2012-09-06 Thread Reuti
On 06.09.2012 at 13:21, Schmidt U. wrote: > If h_vmem is defined in the script, what is then the point of an additional vf option in the script? By default, h_vmem has a higher value than vf, so it must fit first for the job to run. If you want to avoid swapping, both should have the same

[OMPI users] Open-mx issue with ompi 1.6.1

2012-09-06 Thread Douglas Eadline
I built open-mpi 1.6.1 using the open-mx libraries. This worked previously, but now I get the following error. Here is my system: kernel: 2.6.32-279.5.1.el6.x86_64, open-mx: 1.5.2. BTW, open-mx worked previously with open-mpi, and the current version works with mpich2. $ mpiexec -np 8 -machinefile

Re: [OMPI users] SIGSEGV in OMPI 1.6.x

2012-09-06 Thread Yong Qin
Thanks Jeff. I will definitely do the failure analysis, but I just wanted to confirm that this isn't something special in OMPI itself, e.g., missing configuration settings, etc. On Thu, Sep 6, 2012 at 5:01 AM, Jeff Squyres wrote: > If you run into a segv in this code, it almost certainly means that

[OMPI users] MPI_Allreduce fail (minGW gfortran + OpenMPI 1.6.1)

2012-09-06 Thread Yonghui
Dear MPI users and developers, I am having some trouble with MPI_Allreduce. I am using MinGW (gcc 4.6.2) with OpenMPI 1.6.1. The C version of MPI_Allreduce works fine, but the Fortran version fails with an error. Here is the simple Fortran code to reproduce the error: program ma
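
For reference, here is a minimal C sketch of the kind of MPI_Allreduce call being discussed (this is not the poster's code; the failing example in the thread is in Fortran and is truncated above):

/* Minimal illustrative Allreduce: each rank contributes its rank number
 * and every rank receives the sum. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, sum;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Allreduce(&rank, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);
    printf("rank %d: sum = %d\n", rank, sum);

    MPI_Finalize();
    return 0;
}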

Re: [OMPI users] error compiling openmpi-1.6.1 on Windows 7

2012-09-06 Thread Shiqing Fan
Hi Siegmar, Glad to hear that it's working for you. The warning message appears because the loopback adapter is excluded by default, but this adapter is not actually installed on Windows. One solution might be installing the loopback adapter on Windows. It's very easy and takes only a few minutes. Or it m

Re: [OMPI users] MPI_Cart_sub periods

2012-09-06 Thread Jeff Squyres
John -- This cartesian stuff always makes my head hurt. :-) You seem to have hit a bona-fide bug. I have fixed the issue in our SVN trunk and will get the fix moved over to the v1.6 and v1.7 branches. Thanks for the report! On Aug 29, 2012, at 5:32 AM, Craske, John wrote: > Hello, >
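
For context, a minimal C sketch (not John's original code) of the call sequence in question: a 2-D Cartesian communicator with one periodic dimension is split with MPI_Cart_sub, and the periodicity of the resulting sub-communicator is queried afterwards.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Comm cart, sub;
    int dims[2]    = {0, 0};
    int periods[2] = {0, 1};   /* second dimension is periodic */
    int remain[2]  = {0, 1};   /* keep only the second dimension */
    int size, rank, sub_dims[1], sub_periods[1], sub_coords[1];

    MPI_Init(&argc, &argv);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Dims_create(size, 2, dims);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, &cart);
    MPI_Cart_sub(cart, remain, &sub);

    /* The sub-communicator should inherit the periodicity of the
     * dimension that was kept. */
    MPI_Cart_get(sub, 1, sub_dims, sub_periods, sub_coords);
    if (rank == 0)
        printf("kept dimension periodic? %d (expected %d)\n",
               sub_periods[0], periods[1]);

    MPI_Comm_free(&sub);
    MPI_Comm_free(&cart);
    MPI_Finalize();
    return 0;
}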

Re: [OMPI users] python-mrmpi() failed

2012-09-06 Thread Jeff Squyres
On Sep 4, 2012, at 3:09 PM, mariana Vargas wrote: > I am new to this. I have some codes that use MPI for Python, and I just installed (openmpi, mrmpi, mpi4py) in my home directory (on a cluster account) without apparent errors. I tried to perform this simple test in Python and I get the fo

Re: [OMPI users] Regarding the Pthreads

2012-09-06 Thread Jeff Squyres
Your question is somewhat outside the scope of this list. People may chime in with some suggestions, but that's more of a threading question than an MPI question. Be warned that you need to call MPI_Init_thread (not MPI_Init) with MPI_THREAD_MULTIPLE in order to get true multi-threaded
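
A minimal C sketch of the initialization Jeff describes (illustrative only, not taken from the thread): the program requests MPI_THREAD_MULTIPLE and checks the level the library actually provides before letting multiple threads make MPI calls.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided;

    MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
    if (provided < MPI_THREAD_MULTIPLE) {
        /* The library may grant less than requested; do not let several
         * threads call MPI concurrently in that case. */
        fprintf(stderr, "MPI_THREAD_MULTIPLE not available (got %d)\n",
                provided);
        MPI_Abort(MPI_COMM_WORLD, 1);
    }

    /* ... Pthreads that make MPI calls would be created here ... */

    MPI_Finalize();
    return 0;
}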

Re: [OMPI users] SIGSEGV in OMPI 1.6.x

2012-09-06 Thread Jeff Squyres
If you run into a segv in this code, it almost certainly means that you have heap corruption somewhere. FWIW, that has *always* been what it meant when I've run into segv's in any code under opal/mca/memory/linux/. Meaning: my user code did something wrong, it created heap corruption, and t
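
For illustration only (this is not the user's code): a classic heap-corruption pattern of the sort Jeff describes, where the damage is done in user code but the crash often surfaces later, inside the allocator, far from the real bug.

#include <stdlib.h>
#include <string.h>

int main(void)
{
    char *buf = malloc(16);

    /* Deliberate bug: writing past the end of the allocation tramples the
     * allocator's bookkeeping that lives just beyond the block. */
    memset(buf, 0, 32);

    /* The damage is typically only detected here, or at some later
     * malloc/free, which is why the backtrace points into malloc.c
     * rather than at the offending line. */
    free(buf);
    return 0;
}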

Re: [OMPI users] Infiniband performance Problem and stalling

2012-09-06 Thread Yevgeny Kliteynik
On 9/3/2012 4:14 AM, Randolph Pullen wrote: > No RoCE, just native IB with TCP over the top. Sorry, I'm confused - it's still not clear what a "Melanox III HCA 10G card" is. Could you run "ibstat" and post the results? What is the expected BW on your cards? Could you run "ib_write_bw" between two machin

Re: [OMPI users] error compiling openmpi-1.6.1 on Windows 7

2012-09-06 Thread Siegmar Gross
Hi Shiqing, I have solved the problem with the double quotes in OPENMPI_HOME, but there is still something wrong. set OPENMPI_HOME="c:\Program Files (x86)\openmpi-1.6.1" mpicc init_finalize.c Cannot open configuration file "c:\Program Files (x86)\openmpi-1.6.1"/share/openmpi\mpicc-wrapper-data.t

[OMPI users] SIGSEGV in OMPI 1.6.x

2012-09-06 Thread Yong Qin
Hi, While debugging a mysterious crash in a code, I was able to trace it down to a SIGSEGV in OMPI 1.6 and 1.6.1. The offending code is in opal/mca/memory/linux/malloc.c. Please see the following gdb log. (gdb) c Continuing. Program received signal SIGSEGV, Segmentation fault. opal_memory_ptmalloc2