Hi Ralph,

> I believe this was fixed in the trunk and is now scheduled to come
> across to 1.8.3

Today I installed openmpi-1.9a1r32664 and the problem still exists.
Is the backtrace helpful or do you need something else?
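
For reference, the test program does nothing except initialize and
finalize MPI. A minimal sketch of what InitFinalizeMain.java contains
(assuming the standard mpi.MPI Java bindings that ship with Open MPI;
my real file may differ slightly):

  // InitFinalizeMain.java - minimal sketch of the failing test case
  import mpi.*;

  public class InitFinalizeMain {
    public static void main(String[] args) throws MPIException {
      MPI.Init(args);    // the SIGSEGV occurs while this runs; I have
      MPI.Finalize();    // not isolated the exact native call
    }
  }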

tyr java 111 ompi_info | grep MPI:
                Open MPI: 1.9a1r32664
tyr java 112 mpijavac InitFinalizeMain.java 
warning: [path] bad path element "/usr/local/openmpi-1.9_64_cc/lib64/shmem.jar": no such file or directory
1 warning
tyr java 113 /usr/local/gdb-7.6.1_64_gcc/bin/gdb /usr/local/openmpi-1.9_64_cc/bin/mpiexec 
GNU gdb (GDB) 7.6.1
Copyright (C) 2013 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
and "show warranty" for details.
This GDB was configured as "sparc-sun-solaris2.10".
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>...
Reading symbols from /export2/prog/SunOS_sparc/openmpi-1.9_64_cc/bin/orterun...done.
(gdb) run -np 1 java InitFinalizeMain 
Starting program: /usr/local/openmpi-1.9_64_cc/bin/mpiexec -np 1 java InitFinalizeMain
[Thread debugging using libthread_db enabled]
[New Thread 1 (LWP 1)]
[New LWP    2        ]
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0xffffffff7ea3c7f0, pid=3584, tid=2
#
# JRE version: Java(TM) SE Runtime Environment (8.0-b132) (build 1.8.0-b132)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.0-b70 mixed mode solaris-sparc compressed oops)
# Problematic frame:
# C  [libc.so.1+0x3c7f0]  strlen+0x50
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/fd1026/work/skripte/master/parallel/prog/mpi/java/hs_err_pid3584.log
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.sun.com/bugreport/crash.jsp
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
--------------------------------------------------------------------------
mpiexec noticed that process rank 0 with PID 0 on node tyr exited on signal 6 (Abort).
--------------------------------------------------------------------------
[LWP    2         exited]
[New Thread 2        ]
[Switching to Thread 1 (LWP 1)]
sol_thread_fetch_registers: td_ta_map_id2thr: no thread can be found to satisfy query
(gdb) bt
#0  0xffffffff7f6173d0 in rtld_db_dlactivity () from /usr/lib/sparcv9/ld.so.1
#1  0xffffffff7f6175a8 in rd_event () from /usr/lib/sparcv9/ld.so.1
#2  0xffffffff7f618950 in lm_delete () from /usr/lib/sparcv9/ld.so.1
#3  0xffffffff7f6226bc in remove_so () from /usr/lib/sparcv9/ld.so.1
#4  0xffffffff7f624574 in remove_hdl () from /usr/lib/sparcv9/ld.so.1
#5  0xffffffff7f61d97c in dlclose_core () from /usr/lib/sparcv9/ld.so.1
#6  0xffffffff7f61d9d4 in dlclose_intn () from /usr/lib/sparcv9/ld.so.1
#7  0xffffffff7f61db0c in dlclose () from /usr/lib/sparcv9/ld.so.1
#8  0xffffffff7e4e6d88 in vm_close ()
   from /usr/local/openmpi-1.9_64_cc/lib64/libopen-pal.so.0
#9  0xffffffff7e4e4074 in lt_dlclose ()
   from /usr/local/openmpi-1.9_64_cc/lib64/libopen-pal.so.0
#10 0xffffffff7e53a1cc in ri_destructor (obj=0x0)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_component_repository.c:382
#11 0xffffffff7e5379a8 in opal_obj_run_destructors (object=0x0)
    at ../../../../openmpi-1.9a1r32664/opal/class/opal_object.h:446
#12 0xffffffff7e539a3c in mca_base_component_repository_release (component=0xf000)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_component_repository.c:240
#13 0xffffffff7e5400a0 in mca_base_component_unload (component=0x0, output_id=-2145509376)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_components_close.c:47
#14 0xffffffff7e540144 in mca_base_component_close (component=0xffffff7b000030ff, output_id=255)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_components_close.c:60
#15 0xffffffff7e540254 in mca_base_components_close (output_id=767, components=0x0, skip=0xffffff7f73cdf800)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_components_close.c:86
#16 0xffffffff7e540194 in mca_base_framework_components_close (framework=0xff, skip=0xffffff7c801c4000)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_components_close.c:68
#17 0xffffffff7ee49a58 in orte_oob_base_close ()
    at ../../../../openmpi-1.9a1r32664/orte/mca/oob/base/oob_base_frame.c:98
#18 0xffffffff7e56bcfc in mca_base_framework_close (framework=0xffffff7e4e3f3cff)
    at ../../../../openmpi-1.9a1r32664/opal/mca/base/mca_base_framework.c:187
#19 0xffffffff7bb13f00 in rte_finalize ()
    at ../../../../../openmpi-1.9a1r32664/orte/mca/ess/hnp/ess_hnp_module.c:857
#20 0xffffffff7ec3adf0 in orte_finalize ()
    at ../../openmpi-1.9a1r32664/orte/runtime/orte_finalize.c:66
#21 0x000000010000e264 in orterun (argc=4607, argv=0x0)
    at ../../../../openmpi-1.9a1r32664/orte/tools/orterun/orterun.c:1099
#22 0x00000001000046d4 in main (argc=255, argv=0xffffff7f0b067800)
    at ../../../../openmpi-1.9a1r32664/orte/tools/orterun/main.c:13
(gdb) 
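
If I read the backtrace correctly, orterun is inside dlclose() for an
MCA component (frames #0 to #12, via ri_destructor() and
mca_base_component_repository_release()) while rte_finalize() runs. I do
not know whether that is related to the SIGSEGV in strlen() which the
JVM reports.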


Kind regards

Siegmar




> On Sep 2, 2014, at 4:21 AM, Siegmar Gross <siegmar.gr...@informatik.hs-fulda.de> wrote:
> 
> > Hi,
> > 
> > yesterday I installed openmpi-1.8.2 on my machines (Solaris 10 Sparc
> > (tyr), Solaris 10 x86_64 (sunpc0), and openSUSE Linux 12.1 x86_64
> > (linpc0)) with Sun C 5.12. A small Java program works on Linux,
> > but breaks with a segmentation fault on Solaris 10.
> > 
> > 
> > tyr java 172 where mpijavac
> > mpijavac is aliased to \mpijavac -deprecation -Xlint:all
> > /usr/local/openmpi-1.8.2_64_cc/bin/mpijavac
> > tyr java 173 mpijavac InitFinalizeMain.java 
> > tyr java 174 mpiexec -np 1 java InitFinalizeMain
> > #
> > # A fatal error has been detected by the Java Runtime Environment:
> > #
> > #  SIGSEGV (0xb) at pc=0xffffffff7ea3c7f0, pid=28334, tid=2
> > #
> > # JRE version: Java(TM) SE Runtime Environment (8.0-b132) (build 1.8.0-b132)
> > # Java VM: Java HotSpot(TM) 64-Bit Server VM (25.0-b70 mixed mode solaris-sparc compressed oops)
> > # Problematic frame:
> > # C  [libc.so.1+0x3c7f0]  strlen+0x50
> > #
> > # Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
> > #
> > # An error report file with more information is saved as:
> > # /home/fd1026/work/skripte/master/parallel/prog/mpi/java/hs_err_pid28334.log
> > #
> > # If you would like to submit a bug report, please visit:
> > #   http://bugreport.sun.com/bugreport/crash.jsp
> > # The crash happened outside the Java Virtual Machine in native code.
> > # See problematic frame for where to report the bug.
> > #
> > --------------------------------------------------------------------------
> > mpiexec noticed that process rank 0 with PID 28334 on node tyr exited on signal 6 (Abort).
> > --------------------------------------------------------------------------
> > tyr java 175 
> > 
> > 
> > 
> > gdb shows the following backtrace for Sun C 5.12.
> > 
> > tyr java 175 /usr/local/gdb-7.6.1_64_gcc/bin/gdb /usr/local/openmpi-1.8.2_64_cc/bin/mpiexec 
> > GNU gdb (GDB) 7.6.1
> > Copyright (C) 2013 Free Software Foundation, Inc.
> > License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
> > This is free software: you are free to change and redistribute it.
> > There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
> > and "show warranty" for details.
> > This GDB was configured as "sparc-sun-solaris2.10".
> > For bug reporting instructions, please see:
> > <http://www.gnu.org/software/gdb/bugs/>...
> > Reading symbols from /export2/prog/SunOS_sparc/openmpi-1.8.2_64_cc/bin/orterun...done.
> > (gdb) run -np 1 java InitFinalizeMain 
> > Starting program: /usr/local/openmpi-1.8.2_64_cc/bin/mpiexec -np 1 java InitFinalizeMain
> > [Thread debugging using libthread_db enabled]
> > [New Thread 1 (LWP 1)]
> > [New LWP    2        ]
> > #
> > # A fatal error has been detected by the Java Runtime Environment:
> > #
> > #  SIGSEGV (0xb) at pc=0xffffffff7ea3c7f0, pid=28353, tid=2
> > #
> > # JRE version: Java(TM) SE Runtime Environment (8.0-b132) (build 1.8.0-b132)
> > # Java VM: Java HotSpot(TM) 64-Bit Server VM (25.0-b70 mixed mode solaris-sparc compressed oops)
> > # Problematic frame:
> > # C  [libc.so.1+0x3c7f0]  strlen+0x50
> > #
> > # Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
> > #
> > # An error report file with more information is saved as:
> > # /home/fd1026/work/skripte/master/parallel/prog/mpi/java/hs_err_pid28353.log
> > #
> > # If you would like to submit a bug report, please visit:
> > #   http://bugreport.sun.com/bugreport/crash.jsp
> > # The crash happened outside the Java Virtual Machine in native code.
> > # See problematic frame for where to report the bug.
> > #
> > --------------------------------------------------------------------------
> > mpiexec noticed that process rank 0 with PID 28353 on node tyr exited on signal 6 (Abort).
> > --------------------------------------------------------------------------
> > [LWP    2         exited]
> > [New Thread 2        ]
> > [Switching to Thread 1 (LWP 1)]
> > sol_thread_fetch_registers: td_ta_map_id2thr: no thread can be found to satisfy query
> > (gdb) bt
> > #0  0xffffffff7f6173d0 in rtld_db_dlactivity () from /usr/lib/sparcv9/ld.so.1
> > #1  0xffffffff7f6175a8 in rd_event () from /usr/lib/sparcv9/ld.so.1
> > #2  0xffffffff7f618950 in lm_delete () from /usr/lib/sparcv9/ld.so.1
> > #3  0xffffffff7f6226bc in remove_so () from /usr/lib/sparcv9/ld.so.1
> > #4  0xffffffff7f624574 in remove_hdl () from /usr/lib/sparcv9/ld.so.1
> > #5  0xffffffff7f61d97c in dlclose_core () from /usr/lib/sparcv9/ld.so.1
> > #6  0xffffffff7f61d9d4 in dlclose_intn () from /usr/lib/sparcv9/ld.so.1
> > #7  0xffffffff7f61db0c in dlclose () from /usr/lib/sparcv9/ld.so.1
> > #8  0xffffffff7e8cb348 in vm_close ()
> >   from /usr/local/openmpi-1.8.2_64_cc/lib64/libopen-pal.so.6
> > #9  0xffffffff7e8c8634 in lt_dlclose ()
> >   from /usr/local/openmpi-1.8.2_64_cc/lib64/libopen-pal.so.6
> > #10 0xffffffff7e91edcc in ri_destructor (obj=0xff)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_component_repository.c:391
> > #11 0xffffffff7e91c5a0 in opal_obj_run_destructors (object=0xffffff7c701d00ff)
> >    at ../../../../openmpi-1.8.2/opal/class/opal_object.h:446
> > #12 0xffffffff7e91e61c in mca_base_component_repository_release (component=0x10ff)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_component_repository.c:244
> > #13 0xffffffff7e924c78 in mca_base_component_unload (component=0xffffff7f73c63800, output_id=67583)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:47
> > #14 0xffffffff7e924d1c in mca_base_component_close (component=0xffffff0000000100, output_id=268480767)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:60
> > #15 0xffffffff7e924e2c in mca_base_components_close (output_id=1947894015, components=0xffffff7f501368ff, skip=0x2ff)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:86
> > #16 0xffffffff7e924d6c in mca_base_framework_components_close (framework=0xffffff7d7455d4ff, skip=0xffffff7f200a90ff)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:68
> > #17 0xffffffff7ee1d690 in orte_oob_base_close ()
> >    at ../../../../openmpi-1.8.2/orte/mca/oob/base/oob_base_frame.c:94
> > #18 0xffffffff7e954ac0 in mca_base_framework_close (framework=0xffffff0000004b00)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_framework.c:187
> > #19 0xffffffff7be139fc in rte_finalize ()
> >    at ../../../../../openmpi-1.8.2/orte/mca/ess/hnp/ess_hnp_module.c:858
> > #20 0xffffffff7ec38154 in orte_finalize ()
> >    at ../../openmpi-1.8.2/orte/runtime/orte_finalize.c:65
> > #21 0x000000010000ddf0 in orterun (argc=3327, argv=0x0)
> >    at ../../../../openmpi-1.8.2/orte/tools/orterun/orterun.c:1096
> > #22 0x0000000100004614 in main (argc=255, argv=0xffffff7f077de800)
> >    at ../../../../openmpi-1.8.2/orte/tools/orterun/main.c:13
> > (gdb) 
> > 
> > 
> > 
> > 
> > 
> > gdb shows the following backtrace for gcc-4.9.0.
> > 
> > tyr java 108 mpijavac InitFinalizeMain.java
> > tyr java 109 /usr/local/gdb-7.6.1_64_gcc/bin/gdb /usr/local/openmpi-1.8.2_64_gcc/bin/mpiexec 
> > GNU gdb (GDB) 7.6.1
> > Copyright (C) 2013 Free Software Foundation, Inc.
> > License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
> > This is free software: you are free to change and redistribute it.
> > There is NO WARRANTY, to the extent permitted by law.  Type "show copying"
> > and "show warranty" for details.
> > This GDB was configured as "sparc-sun-solaris2.10".
> > For bug reporting instructions, please see:
> > <http://www.gnu.org/software/gdb/bugs/>...
> > Reading symbols from /export2/prog/SunOS_sparc/openmpi-1.8.2_64_gcc/bin/orterun...done.
> > (gdb) run -np 1 java InitFinalizeMain
> > Starting program: /usr/local/openmpi-1.8.2_64_gcc/bin/mpiexec -np 1 java InitFinalizeMain
> > [Thread debugging using libthread_db enabled]
> > [New Thread 1 (LWP 1)]
> > [New LWP    2        ]
> > #
> > # A fatal error has been detected by the Java Runtime Environment:
> > #
> > #  SIGSEGV (0xb) at pc=0xffffffff7ea3c7f0, pid=28454, tid=2
> > #
> > # JRE version: Java(TM) SE Runtime Environment (8.0-b132) (build 1.8.0-b132)
> > # Java VM: Java HotSpot(TM) 64-Bit Server VM (25.0-b70 mixed mode solaris-sparc compressed oops)
> > # Problematic frame:
> > # C  [libc.so.1+0x3c7f0]  strlen+0x50
> > #
> > # Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
> > #
> > # An error report file with more information is saved as:
> > # /home/fd1026/work/skripte/master/parallel/prog/mpi/java/hs_err_pid28454.log
> > #
> > # If you would like to submit a bug report, please visit:
> > #   http://bugreport.sun.com/bugreport/crash.jsp
> > # The crash happened outside the Java Virtual Machine in native code.
> > # See problematic frame for where to report the bug.
> > #
> > --------------------------------------------------------------------------
> > mpiexec noticed that process rank 0 with PID 28454 on node tyr exited on signal 6 (Abort).
> > --------------------------------------------------------------------------
> > [LWP    2         exited]
> > [New Thread 2        ]
> > [Switching to Thread 1 (LWP 1)]
> > sol_thread_fetch_registers: td_ta_map_id2thr: no thread can be found to satisfy query
> > (gdb) bt
> > #0  0xffffffff7f6173d0 in rtld_db_dlactivity () from /usr/lib/sparcv9/ld.so.1
> > #1  0xffffffff7f6175a8 in rd_event () from /usr/lib/sparcv9/ld.so.1
> > #2  0xffffffff7f618950 in lm_delete () from /usr/lib/sparcv9/ld.so.1
> > #3  0xffffffff7f6226bc in remove_so () from /usr/lib/sparcv9/ld.so.1
> > #4  0xffffffff7f624574 in remove_hdl () from /usr/lib/sparcv9/ld.so.1
> > #5  0xffffffff7f61d97c in dlclose_core () from /usr/lib/sparcv9/ld.so.1
> > #6  0xffffffff7f61d9d4 in dlclose_intn () from /usr/lib/sparcv9/ld.so.1
> > #7  0xffffffff7f61db0c in dlclose () from /usr/lib/sparcv9/ld.so.1
> > #8  0xffffffff7ec77474 in vm_close ()
> >   from /usr/local/openmpi-1.8.2_64_gcc/lib64/libopen-pal.so.6
> > #9  0xffffffff7ec74a54 in lt_dlclose ()
> >   from /usr/local/openmpi-1.8.2_64_gcc/lib64/libopen-pal.so.6
> > #10 0xffffffff7ec99b78 in ri_destructor (obj=0x1001ead80)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_component_repository.c:391
> > #11 0xffffffff7ec98490 in opal_obj_run_destructors (object=0x1001ead80)
> >    at ../../../../openmpi-1.8.2/opal/class/opal_object.h:446
> > #12 0xffffffff7ec993f4 in mca_base_component_repository_release (
> >    component=0xffffffff7b023ef0 <mca_oob_tcp_component>)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_component_repository.c:244
> > #13 0xffffffff7ec9b73c in mca_base_component_unload (
> >    component=0xffffffff7b023ef0 <mca_oob_tcp_component>, output_id=-1)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:47
> > #14 0xffffffff7ec9b7d0 in mca_base_component_close (
> >    component=0xffffffff7b023ef0 <mca_oob_tcp_component>, output_id=-1)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:60
> > #15 0xffffffff7ec9b8a4 in mca_base_components_close (output_id=-1, 
> >    components=0xffffffff7f12b030 <orte_oob_base_framework+80>, skip=0x0)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:86
> > #16 0xffffffff7ec9b80c in mca_base_framework_components_close (
> >    framework=0xffffffff7f12afe0 <orte_oob_base_framework>, skip=0x0)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_components_close.c:66
> > #17 0xffffffff7efae0e8 in orte_oob_base_close ()
> >    at ../../../../openmpi-1.8.2/orte/mca/oob/base/oob_base_frame.c:94
> > #18 0xffffffff7ecb28b4 in mca_base_framework_close (
> >    framework=0xffffffff7f12afe0 <orte_oob_base_framework>)
> >    at ../../../../openmpi-1.8.2/opal/mca/base/mca_base_framework.c:187
> > #19 0xffffffff7bf078c0 in rte_finalize ()
> >    at ../../../../../openmpi-1.8.2/orte/mca/ess/hnp/ess_hnp_module.c:858
> > #20 0xffffffff7ef30924 in orte_finalize ()
> >    at ../../openmpi-1.8.2/orte/runtime/orte_finalize.c:65
> > #21 0x00000001000070c4 in orterun (argc=5, argv=0xffffffff7fffe0f8)
> >    at ../../../../openmpi-1.8.2/orte/tools/orterun/orterun.c:1096
> > #22 0x0000000100003d70 in main (argc=5, argv=0xffffffff7fffe0f8)
> >    at ../../../../openmpi-1.8.2/orte/tools/orterun/main.c:13
> > (gdb) 
> > 
> > 
> > 
> > I would be grateful if somebody could fix the problem. Thank you
> > very much for any help in advance.
> > 
> > 
> > Kind regards
> > 
> > Siegmar
> > 