Hi,
has anyone successfully combined OpenMPI and GotoBLAS2? I'm seeing
segfaults in every program that combines the two libraries (as shared
libs). The segmentation fault seems to occur in MPI_Init(). The gdb
backtrace is:
Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x424fb940 (LWP 9155)]
0x00002b2956ca9900 in opal_list_t_class () from
/share/apps/openmpi/1.3.2/gcc-4.1.2/lib/libopen-pal.so.0
(gdb) bt
#0 0x00002b2956ca9900 in opal_list_t_class () from
/share/apps/openmpi/1.3.2/gcc-4.1.2/lib/libopen-pal.so.0
#1 0x0000003df9c06307 in start_thread () from /lib64/libpthread.so.0
#2 0x0000003df90d1ded in clone () from /lib64/libc.so.6
#3 0x0000000000000000 in ?? ()
(For the sake of completeness I have attached the test program; it needs to
be compiled with -DSIZEOF_SCAL_T=8, e.g.
mpicc -DSIZEOF_SCAL_T=8 -o blas blas.c -L$GOTO2_LIB -lgoto2 -lgfortran).
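(Assuming a standard launch, something like
mpirun -np 1 ./blas
should already be enough to trigger the crash, since the segfault happens in
MPI_Init() before the GEMM call is ever reached.)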
I'm working on an Opteron x86_64 system running CentOS 5.2. GotoBLAS2 v1.10
was compiled with gcc-4.1.2. The problem occurs with OpenMPI 1.3.2 and
OpenMPI 1.4.0 alike (both compiled with gcc-4.1.2).
I could post the `ompi_info --all` output if anyone is interested, but I think
both installations are quite standard (openib, romio).
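A stripped-down variant that makes no BLAS call at all but is still linked
against libgoto2 might also help to narrow things down: if it segfaults in
MPI_Init() as well, the problem would be in library initialization rather
than in the GEMM call itself. I haven't tested this exact snippet, but it
would be something like

#include <mpi.h>

/* No BLAS call at all; link with e.g.
 *   mpicc -o mpionly mpionly.c -L$GOTO2_LIB -lgoto2 -lgfortran
 * so that libgoto2 is still pulled in as a shared library. */
int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    return MPI_Finalize();
}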
Thanks,
Dorian
#include <stdio.h>
#include <stdlib.h>
#include <mpi.h>

/* Precision is selected at compile time via -DSIZEOF_SCAL_T. */
#if 4 == SIZEOF_SCAL_T
typedef float Scal_t;
#define GEMM sgemm_
#elif 8 == SIZEOF_SCAL_T
typedef double Scal_t;
#define GEMM dgemm_
#else
#error "compile with -DSIZEOF_SCAL_T=4 or -DSIZEOF_SCAL_T=8"
#endif

/* Fortran GEMM from GotoBLAS2: C := alpha*op(A)*op(B) + beta*C */
extern void GEMM(char *, char *, int *, int *, int *, Scal_t *, Scal_t *, int *,
                 Scal_t *, int *, Scal_t *, Scal_t *, int *);

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    /* Viewed column-major by the Fortran routine, A is 3x2, B is 2x5
     * and C is 3x5. */
    Scal_t A[2][3] = { { 5.0, 4.0, 2.0 }, { 1.0, -1.0, 0.0 } };
    Scal_t B[5][2] = { { 1.0, 0.0 }, { 0.0, -1.0 }, { 0.5, 1.5 },
                       { 6.0, 2.0 }, { 8.0, -2.0 } };
    Scal_t C[3][5];

    char transa = 'N';
    char transb = 'N';
    int N = 3;   /* rows of A and C      */
    int K = 2;   /* cols of A, rows of B */
    int M = 5;   /* cols of B and C      */
    int lda = N;
    int ldb = K;
    int ldc = N;
    Scal_t alpha = 1.0;
    Scal_t beta = 0.0;

    /* C = alpha * A * B + beta * C */
    GEMM(&transa, &transb, &N, &M, &K, &alpha, &A[0][0], &lda,
         &B[0][0], &ldb, &beta, &C[0][0], &ldc);

    return MPI_Finalize();
}