Hello,
I am using OpenMPI 2.0.0 with a computational fluid dynamics
software package and I am encountering a series of errors when running it
with mpirun. This is my lscpu output:
CPU(s):                4
On-line CPU(s) list:   0-3
Thread(s) per core:    2
Core(s) per socket:    2
Socket(s):             1
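So this is a single socket with 2 physical cores and 2 hardware threads
per core, i.e. 4 logical CPUs. If it is of any use, hwloc's lstopo (the
library Open MPI itself uses for topology) prints the same layout:

    # show the machine topology as Open MPI's mapper sees it
    lstopo --no-io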
When I run make check I get failures like these (output truncated):

    ompi_datatype_pack_external_size
    FAIL external32 (exit status: [...])
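If it helps, the failing test can be re-run on its own from the build
tree to get a fuller log; the directory layout below is my assumption
from the 2.x source tarball:

    # re-run just the failing datatype test; automake keeps a per-test log
    cd test/datatype
    make check TESTS=external32
    cat external32.log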
On Tue, Jun 13, 2017 at 5:24 PM, ashwin .D wrote:
> [...]
I also compared the results with the sequential form of my application,
and the MPI version is much slower although I am using shared memory and
all the cores are on the same machine.
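My own guess: with 2 physical cores carrying 2 hardware threads each,
running 4 ranks puts two ranks on every core, and that oversubscription
alone can make the MPI run slower than the sequential one. The binding
flags below make the placement visible (cosmo_exe is a placeholder for
the actual binary):

    # one rank per physical core; --report-bindings prints where each rank lands
    mpirun -np 2 --map-by core --bind-to core --report-bindings ./cosmo_exe

If the 2-rank bound run already beats the 4-rank run, oversubscription is
the likely culprit.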
Best regards,
Ashwin.
On Tue, Jun 13, 2017 at 5:52 PM, ashwin .D wrote:
> Also when I try to build and run make check I get these errors - am I [...]
Thank you, the suggestions so far have been useful. Please give me a
couple of days to implement some of the ideas that you both have
suggested and allow me to get back to you.
Best regards,
Ashwin
On Wed, Jun 14, 2017 at 4:01 PM, ashwin .D wrote:
> Hello,
> I found a thread with Intel MPI (although I am using gfortran and
> OpenMPI) [...]
Hello Gilles,
I am enclosing all the information you requested.
1) As an attachment I enclose the log file.
2) I did rebuild OpenMPI 2.1.1 with the --enable-debug option and I
reinstalled it under /usr/local (the rebuild is sketched below).
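For reproducibility, the rebuild amounts to something like this; the
prefix is my assumption, adjust as needed:

    # debug rebuild of Open MPI 2.1.1 installed under /usr/local
    ./configure --prefix=/usr/local --enable-debug
    make -j 4
    sudo make install
    sudo ldconfig    # refresh the linker cache so /usr/local/lib is found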
I ran all the examples in the examples directory. All passed except the
oshmem ones.
There is a sequential version of the same program COSMO (no reference to
MPI) that I can run without any problems. Of course it takes a lot longer
to complete. Now I also ran valgrind (not sure whether that is useful or
not) and I have enclosed the logs.
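For completeness, the valgrind invocation was along these lines (the
binary name is again a placeholder, and the suppression file path assumes
the /usr/local prefix; Open MPI ships that file to filter known false
positives):

    # one valgrind log per rank (%p expands to the PID)
    mpirun -np 2 valgrind \
        --suppressions=/usr/local/share/openmpi/openmpi-valgrind.supp \
        --log-file=vg.%p.log ./cosmo_exe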
On Sat, Jun 17, 2017 at 7:20 PM, ashwin .D wrote:
> [...] fixed,
> so it should be easier to figure out what is going wrong.
> Cheers,
> Gilles
On Sun, Jun 18, 2017 at 11:41 AM, ashwin .D wrote:
> [...]