Vahid,

In the v1.10 series, the default MPI-IO component was ROMIO-based; in
the v3 series, it is now ompio.
You can force the latest Open MPI to use the ROMIO-based component with
mpirun --mca io romio314 ...
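
For reference, here are a few equivalent ways to select and inspect the io
component (a minimal sketch; "./your_app" is a placeholder, and the
romio314 name assumes it matches what ompi_info reports for your build):

    # list the io components your build provides
    ompi_info | grep "MCA io"

    # select the ROMIO-based component on the command line
    mpirun --mca io romio314 ./your_app

    # or, equivalently, via the environment
    export OMPI_MCA_io=romio314
    mpirun ./your_app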

That being said, your description (e.g. a hand-edited input file) suggests
that the I/O is not performed with MPI-IO,
which makes me very puzzled as to why the latest Open MPI is crashing.

Cheers,

Gilles

On Fri, Jan 19, 2018 at 10:55 AM, Edgar Gabriel <egabr...@central.uh.edu> wrote:
> I will try to reproduce this problem with 3.0.x, but it might take me a
> couple of days to get to it.
>
> Since it seemed to have worked with 2.0.x (except for the problem of
> running out of file handles), the suspicion is that one of the fixes we
> have introduced since then is the culprit.
>
> What file system did you run it on? NFS?
>
> Thanks
>
> Edgar
>
>
> On 1/18/2018 5:17 PM, Jeff Squyres (jsquyres) wrote:
>>
>> On Jan 18, 2018, at 5:53 PM, Vahid Askarpour <vh261...@dal.ca> wrote:
>>>
>>> My openmpi3.0.x run (called nscf run) was reading data from a routine
>>> Quantum Espresso input file edited by hand. The preliminary run (called scf
>>> run) was done with openmpi3.0.x on a similar input file also edited by hand.
>>
>> Gotcha.
>>
>> Well, that's a little disappointing.
>>
>> It would be good to understand why it is crashing -- is the app doing
>> something that is accidentally not standard?  Is there a bug in (soon to be
>> released) Open MPI 3.0.1?  ...?
>>
>