Gilles,

I will try the 3.0.1rc1 version to see how it goes.

Thanks,

Vahid

On Jan 5, 2018, at 8:40 PM, Gilles Gouaillardet 
<gilles.gouaillar...@gmail.com> wrote:

Vahid,

This looks like the description of the issue reported at 
https://github.com/open-mpi/ompi/issues/4336
The fix is currently available in 3.0.1rc1, and I will backport it to the
v2.x branch.
A workaround is to use ROMIO instead of ompio; you can achieve this with
mpirun --mca io ^ompio ...
(FWIW, the 1.10 series uses ROMIO by default, so there is no leak out of the box.)
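For example, a full run with the workaround might look like this (the
application name, process count, and I/O redirection are illustrative), and
you can list the io components your install provides with ompi_info:

mpirun --mca io ^ompio -np 4 ./epw.x < epw.in > epw.out
ompi_info | grep ' io:'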

IIRC, a possible (and ugly) workaround for the compilation issue is to
configure --with-ucx=/usr ...
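For example, a minimal build sketch with that workaround (the install prefix
is illustrative):

./configure --with-ucx=/usr --prefix=$HOME/opt/openmpi-1.10.7
make -j 4 all
make install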
That being said, you should really upgrade to a supported version of Open MPI,
as previously suggested.

Cheers,

Gilles

On Saturday, January 6, 2018, Jeff Squyres (jsquyres) 
<jsquy...@cisco.com> wrote:
You can still give Open MPI 2.1.1 a try.  It should be source-compatible with
EPW, and hopefully its behavior is close enough to the 1.10 series that EPW will work.

If not, please encourage the EPW developers to upgrade.  v3.0.x is the current 
stable series; v1.10.x is ancient.



> On Jan 5, 2018, at 5:22 PM, Vahid Askarpour 
> <vh261...@dal.ca> wrote:
>
> Thank you, Jeff, for your suggestion to use the v2.1 series.
>
> I am attempting to use openmpi with EPW. On the EPW website 
> (http://epw.org.uk/Main/DownloadAndInstall), it is stated that:
>
>> Compatibility of EPW
>>
>> EPW is tested and should work on the following compilers and libraries:
>>
>>      • gcc640 serial
>>      • gcc640 + openmpi-1.10.7
>>      • intel 12 + openmpi-1.10.7
>>      • intel 17 + impi
>>      • PGI 17 + mvapich2.3
>> EPW is known to have the following incompatibilities:
>>
>>      • openmpi 2.0.2 (but likely all 2.x.x versions): works, but with a memory 
>> leak. If you open and close a file many times with openmpi 2.0.2, memory 
>> increases linearly with the number of times the file is opened.
>
> So I am hoping to avoid the 2.x.x series and use the 1.10.7 version suggested 
> by the EPW developers. However, it appears that this is not possible.
>
> Vahid
>
>> On Jan 5, 2018, at 5:06 PM, Jeff Squyres (jsquyres) 
>> <jsquy...@cisco.com> wrote:
>>
>> I forget what the underlying issue was, but this issue just came up and was 
>> recently fixed:
>>
>>    https://github.com/open-mpi/ompi/issues/4345
>>
>> However, the v1.10 series is fairly ancient -- the fix was not applied to 
>> that series.  The fix was applied to the v2.1.x series, and a snapshot 
>> tarball containing the fix is available here (generally just take the latest 
>> tarball):
>>
>>    https://www.open-mpi.org/nightly/v2.x/
>>
>> The fix is still pending for the v3.0.x and v3.1.x series (i.e., there are 
>> pending pull requests that haven't been merged yet, so the nightly snapshots 
>> for the v3.0.x and v3.1.x branches do not yet contain this fix).
>>
>>
>>
>>> On Jan 5, 2018, at 1:34 PM, Vahid Askarpour 
>>> <vh261...@dal.ca> wrote:
>>>
>>> I am attempting to install openmpi-1.10.7 on CentOS Linux (7.4.1708) using 
>>> GCC-6.4.0.
>>>
>>> When compiling, I get the following error:
>>>
>>> make[2]: Leaving directory 
>>> '/home/vaskarpo/bin/openmpi-1.10.7/ompi/mca/pml/ob1'
>>> Making all in mca/pml/ucx
>>> make[2]: Entering directory 
>>> '/home/vaskarpo/bin/openmpi-1.10.7/ompi/mca/pml/ucx'
>>> CC       pml_ucx.lo
>>> CC       pml_ucx_request.lo
>>> CC       pml_ucx_datatype.lo
>>> CC       pml_ucx_component.lo
>>> CCLD     mca_pml_ucx.la
>>> libtool:   error: require no space between '-L' and '-lrt'
>>> make[2]: *** [Makefile:1725: mca_pml_ucx.la] Error 1
>>> make[2]: Leaving directory 
>>> '/home/vaskarpo/bin/openmpi-1.10.7/ompi/mca/pml/ucx'
>>> make[1]: *** [Makefile:3261: all-recursive] Error 1
>>> make[1]: Leaving directory '/home/vaskarpo/bin/openmpi-1.10.7/ompi'
>>> make: *** [Makefile:1777: all-recursive] Error 1
>>>
>>> Thank you,
>>>
>>> Vahid
>>
>>
>> --
>> Jeff Squyres
>> jsquy...@cisco.com
>>
>>
>>


--
Jeff Squyres
jsquy...@cisco.com


