Hello,
I've been having trouble for a while now running some OpenMPI+IB jobs on
multiple tasks. The problems are all "hangs" and are not reproducible: the
same execution, started again, will generally proceed just fine where it
previously got stuck, but then get stuck later. These stuck processes a[...]
The command I use to compile and run is:
mpic++ server.cc -o server && mpic++ client.cc -o client && mpirun -np 1 ./server
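For context, the overall shape of the server side (a minimal, untested
sketch of the usual MPI_Open_port/MPI_Comm_accept pattern; the attached
code differs in detail) is:

    #include <mpi.h>
    #include <cstdio>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);

        // Open a port and print it so the client can be given the address.
        char port[MPI_MAX_PORT_NAME];
        MPI_Open_port(MPI_INFO_NULL, port);
        std::printf("server listening on: %s\n", port);

        // Accepting a connection yields an inter-communicator whose
        // remote group is the client.
        MPI_Comm inter;
        MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_SELF, &inter);

        // Synchronize with the client over the inter-communicator.
        MPI_Barrier(inter);

        MPI_Comm_disconnect(&inter);
        MPI_Close_port(port);
        MPI_Finalize();
        return 0;
    }

The client mirrors this, calling MPI_Comm_connect() with the same port
string instead of MPI_Open_port()/MPI_Comm_accept().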
Rodrigo
On Tue, Mar 20, 2012 at 3:40 PM, Rodrigo Oliveira wrote:
> Hi Edgar.
>
> Thanks for the response. The simplified code is attached: server, client
> and a .h containing some constants. I put some "prints" to show the
> behavior.
Hi Edgar.
Thanks for the response. The simplified code is attached: server, client
and a .h containing some constants. I put some "prints" to show the
behavior.
Regards
Rodrigo
On Tue, Mar 20, 2012 at 11:47 AM, Edgar Gabriel wrote:
> do you have by any chance the actual code or a small reproducer? It might
> be much easier to hunt the problem down...
David -
No problem. Generally, you can get away with that trick whenever you're
compiling shared libraries and using the system compiler. The biggest
advantage is that the chance of the code being mis-compiled goes down
significantly when using the system compiler. The kernel is a good stress
test [...]
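In practice that split looks something like this (flags and prefix are
illustrative, not a tested recipe): configure Open MPI's C++ side with the
system compiler and keep Intel where it works:

    ./configure CC=icc CXX=g++ F77=ifort FC=ifort --prefix=/opt/openmpi-1.4.5
    make all install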
You are right, Brian. I failed to build from a fresh untar, so I had some
leftover Intel-compiled bits.
Your suggestion seems to work, so I'll pass it along to the user to try out.
Thanks!
-david
--
David Gunter
HPC-3: Infrastructure Team
Los Alamos National Laboratory
On Mar 20, 2012, at 9[...]
On 03/20/2012 08:35 AM, Gunter, David O wrote:
> I wish it were that easy. When I go that route, I get error messages like
> the following when trying to compile the parallel code with Intel:
> libmpi.so: undefined reference to `__intel_sse2_strcpy'
> and other messages for every single Intel-implemented standard C function.
That doesn't make a whole lot of sense; what compile / link line is
resulting in that error message? The error message is saying that
libmpi.so depends on an Intel built-in function, but since you built
libmpi.so with gcc, that shouldn't happen. Are you sure that libmpi.so
wasn't built against the Intel compiler? [...]
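One quick way to check (commands illustrative, assuming binutils and a
default install layout):

    ompi_info | grep -i compiler      # which compilers Open MPI was configured with
    nm -D libmpi.so | grep __intel    # any hits mean Intel-compiled objects leaked in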
I tried that at first but hit the same problem I'm having now - lots of
"__intel_" undefined errors when I go to compile even the simplest of MPI codes.
-david
--
David Gunter
HPC-3: Infrastructure Team
Los Alamos National Laboratory
On Mar 20, 2012, at 9:36 AM, Jeffrey Squyres wrote:
> Can you build with g++ instead of icpc?
Can you build with g++ instead of icpc?
All the C++ MPI bindings are inlined anyway, so the performance difference
between the two might be negligible.
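If you just need the wrapper to use a different C++ compiler for one
build, the OMPI_CXX environment variable overrides it, e.g.:

    OMPI_CXX=g++ mpic++ server.cc -o server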
On Mar 20, 2012, at 11:35 AM, Gunter, David O wrote:
> I wish it were that easy. When I go that route, I get error messages like
> the following when trying to compile the parallel code with Intel:
I wish it were that easy. When I go that route, I get error messages like the
following when trying to compile the parallel code with Intel:
libmpi.so: undefined reference to `__intel_sse2_strcpy'
and other messages for every single Intel-implemented standard C function.
-david
--
David Gunter
HPC-3: Infrastructure Team
Los Alamos National Laboratory
do you by any chance have the actual code or a small reproducer? It might
be much easier to hunt the problem down...
Thanks
Edgar
On 3/19/2012 8:12 PM, Rodrigo Oliveira wrote:
> Hi there.
>
> I am facing a very strange problem when using MPI_Barrier over an
> inter-communicator after some operations [...]
On 3/20/12 10:06 AM, "Gunter, David O" wrote:
>I need to build ompi-1.4.3 (or the newer 1.4.5) with an older Intel 10.0
>compiler, but on a newer system in which the default g++ headers are
>incompatible with Intel. Thus the C and Fortran compilers function
>normally but the Intel C++ compiler fails to build even a simple "hello
>world" code.
I need to build ompi-1.4.3 (or the newer 1.4.5) with an older Intel 10.0
compiler, but on a newer system in which the default g++ headers are
incompatible with Intel. Thus the C and Fortran compilers function normally but
the Intel C++ compiler fails to build even a simple "hello world" code.
[...]
Did you run autogen.pl?
(if you're working with the Open MPI trunk for development reasons, you might
want to post to the de...@open-mpi.org list, not the general users list)
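For an SVN trunk checkout the usual sequence (prefix illustrative) is:

    ./autogen.pl
    ./configure --prefix=$HOME/openmpi-trunk-install
    make all install

since the trunk checkout ships no pre-generated configure script.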
On Mar 20, 2012, at 8:31 AM, Ilias Miroslav wrote:
> Dear all,
>
> I updated ompi-trunk to the most recent trunk: [...]
Dear all,
I updated ompi-trunk to the most recent trunk:
il...@frpd2.utcpd.sk:~/bin/openmpi_i32lp64_intel_static/ompi-trunk/.svn info
Path: .
URL: http://svn.open-mpi.org/svn/ompi/trunk
Repository Root: http://svn.open-mpi.org/svn/ompi
Repository UUID: 63e3feb5-37d5-0310-a306-e8a459e722fe
Revision: [...]