Hi Gilles,

> can you post your test program?

Attached.

> did you try to run the very same test with ompi configured without
> --enable-heterogeneous?

No.

> can you reproduce the crash with the v2.x series?

No. I tried, but wasn't successful :-)).


Kind regards

Siegmar


> Cheers,
>
> Gilles

> On Tuesday, September 8, 2015, Siegmar Gross
> <siegmar.gr...@informatik.hs-fulda.de> wrote:

    Hi,

    yesterday I built openmpi-v1.10.0-5-ge0b85ea on my
    machines (Solaris 10 Sparc, Solaris 10 x86_64, and openSUSE
    Linux 12.1 x86_64) with gcc-5.1.0 and Sun C 5.13.

    Sometimes I get the following problem when I run a small Java
    program in my heterogeneous environment. The problem arises
    even if I only use my Linux box, and it doesn't matter whether
    I use the cc- or gcc-built version of Open MPI. Sometimes I
    must run the program 10 times before the error shows up. I
    couldn't reproduce the error with openmpi-v2.x-dev-325-g8ae44ea.

    linpc1 fd1026 102 mpiexec -np 2 java MsgSendRecvMain
    Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
    /usr/local/openmpi-1.10.1_64_cc/lib64/libmpi_java.so.1.2.0 which
    might have disabled stack guard. The VM will try to fix the stack
    guard now.
    It's highly recommended that you fix the library with 'execstack -c
    <libfile>', or link it with '-z noexecstack'.
    Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
    /usr/local/openmpi-1.10.1_64_cc/lib64/libmpi_java.so.1.2.0 which
    might have disabled stack guard. The VM will try to fix the stack
    guard now.
    It's highly recommended that you fix the library with 'execstack -c
    <libfile>', or link it with '-z noexecstack'.

    Now 1 process sends its greetings.

    Greetings from process 1:
       message tag:    3
       message length: 6
       message:        linpc1



    linpc1 fd1026 102 mpiexec -np 4 java MsgSendRecvMain
    ... (above message)

    Now 3 processes are sending greetings.

    Greetings from process 3:
       message tag:    3
       message length: 6
       message:
    linpc1??????????????????????????????????????????????????????????????????|}~
    ?(?4???? ?????? ?????? ?????? ?????  ???
    9:!"?

    Greetings from process 2:
       message tag:    3


    Kind regards

    Siegmar



_______________________________________________
users mailing list
us...@open-mpi.org
Subscription: http://www.open-mpi.org/mailman/listinfo.cgi/users
Link to this post: 
http://www.open-mpi.org/community/lists/users/2015/09/27545.php
/* Small program that sends/receives a message with point-to-point
 * operations.
 *
 * "mpijavac" and the Java bindings are available in Open MPI
 * version 1.7.4 or newer.
 *
 *
 * Class file generation:
 *   mpijavac MsgSendRecvMain.java
 *
 * Usage:
 *   mpiexec [parameters] java [parameters] MsgSendRecvMain
 *
 * Examples:
 *   mpiexec java MsgSendRecvMain
 *   mpiexec -host sunpc1,linpc1,rs0 java -cp $HOME/mpi_classfiles \
 *      MsgSendRecvMain
 *
 *
 * File: MsgSendRecvMain.java           Author: S. Gross
 * Date: 19.09.2013
 *
 */

import mpi.*;

public class MsgSendRecvMain
{
  static final int SENDTAG  = 1;        /* send message command         */
  static final int EXITTAG  = 2;        /* termination command          */
  static final int MSGTAG   = 3;        /* normal message tag           */
  static final int BUF_SIZE = 256;      /* message buffer size          */

  public static void main (String args[]) throws MPIException
  {
    int    mytid,                       /* my task id                   */
           ntasks,                      /* number of parallel tasks     */
           num,                         /* number of received data items*/
           i;                           /* loop variable                */
    char   buffer[];                    /* message buffer               */
    Status status;                      /* status of MPI operation      */

    MPI.Init (args);
    mytid  = MPI.COMM_WORLD.getRank ();
    ntasks = MPI.COMM_WORLD.getSize ();
    buffer = new char[BUF_SIZE];
    if (mytid == 0)
    {
      if (ntasks == 2)
      {
        System.out.println ("\nNow " + (ntasks - 1) + " process " +
                            "sends its greetings.\n");
      }
      else
      {
        System.out.println ("\nNow " + (ntasks - 1) + " processes " +
                            "are sending greetings.\n");
      }
      /* request messages                                               */
      for (i = 1; i < ntasks; ++i)
      {
        MPI.COMM_WORLD.send (buffer, 0, MPI.CHAR, i, SENDTAG);
      }
      /* wait for messages and print greetings                          */
      for (i = 1; i < ntasks; ++i)
      {
        status = MPI.COMM_WORLD.recv (buffer, BUF_SIZE, MPI.CHAR,
                                      MPI.ANY_SOURCE, MPI.ANY_TAG);
        num = status.getCount (MPI.CHAR);
        System.out.println ("Greetings from process " +
                            status.getSource () + ":\n" +
                            "  message tag:    " + status.getTag () +
                            "\n" +
                            "  message length: " + num +
                            "\n" +
                            "  message:        " +
                            String.valueOf (buffer) + "\n");
      }
      /* terminate all processes                                        */
      for (i = 1; i < ntasks; ++i)
      {
        MPI.COMM_WORLD.send (buffer, 0, MPI.CHAR, i, EXITTAG);
      }

    }
    else
    {
      boolean more_to_do = true;

      while (more_to_do)
      {
        status = MPI.COMM_WORLD.recv (buffer, buffer.length, MPI.CHAR, 0,
                                      MPI.ANY_TAG);
        if (status.getTag () != EXITTAG)
        {
          buffer = (MPI.getProcessorName()).toCharArray();
          MPI.COMM_WORLD.send (buffer, buffer.length, MPI.CHAR,
                               0, MSGTAG);
        }
        else
        {
          more_to_do = false;           /* terminate                    */
        }
      }
    }
    MPI.Finalize ();
  }
}
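
An incidental note on the reproducer itself: on the master side,
String.valueOf (buffer) prints all BUF_SIZE characters, including any
stale bytes left in the reused buffer by an earlier, longer message, so
the printed message can look more corrupted than the received data
actually is. If that ever gets in the way of diagnosis, printing only
the received count is a small tweak. A plain-Java sketch (no MPI
required; the class name and sample strings are illustrative only):

```java
public class PrintCountSketch
{
  public static void main (String args[])
  {
    char buffer[] = new char[16];

    /* simulate a longer, earlier message left in the reused buffer  */
    "old-leftovers!!".getChars (0, 15, buffer, 0);

    /* simulate a new, shorter message of length num                 */
    int num = 6;
    "linpc1".getChars (0, num, buffer, 0);

    /* prints the whole buffer, including the stale tail             */
    System.out.println (String.valueOf (buffer));
    /* prints only the num received characters: "linpc1"             */
    System.out.println (String.valueOf (buffer, 0, num));
  }
}
```

In the reproducer, the equivalent change would be to pass the count from
status.getCount (MPI.CHAR) as the third argument of String.valueOf.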
