Hi Howard,

I've attached the file.

Best regards

Siegmar


On 14.01.2016 at 18:40, Howard Pritchard wrote:
Hi Siegmar,

Would you mind posting your MsgSendRecvMain to the mailing list? I'd like
to see if I can reproduce it on my Linux box.

Thanks,

Howard




2016-01-14 7:30 GMT-07:00 Siegmar Gross <siegmar.gr...@informatik.hs-fulda.de>:

    Hi,

    I've successfully built openmpi-v1.10.1-140-g31ff573 on my machine
    (SUSE Linux Enterprise Server 12.0 x86_64) with gcc-5.2.0 and
    Sun C 5.13. Unfortunately I get warnings from the Java VM when I run
    a Java program with the cc-built version, although I added
    "-z noexecstack" to CFLAGS. I used the following commands to build
    the package.


    mkdir openmpi-v1.10.1-140-g31ff573-${SYSTEM_ENV}.${MACHINE_ENV}.64_cc
    cd openmpi-v1.10.1-140-g31ff573-${SYSTEM_ENV}.${MACHINE_ENV}.64_cc

    ../openmpi-v1.10.1-140-g31ff573/configure \
       --prefix=/usr/local/openmpi-1.10.2_64_cc \
       --libdir=/usr/local/openmpi-1.10.2_64_cc/lib64 \
       --with-jdk-bindir=/usr/local/jdk1.8.0_66/bin \
       --with-jdk-headers=/usr/local/jdk1.8.0_66/include \
       JAVA_HOME=/usr/local/jdk1.8.0_66 \
       LDFLAGS="-m64 -mt" \
       CC="cc" CXX="CC" FC="f95" \
       CFLAGS="-m64 -mt -z noexecstack" CXXFLAGS="-m64 -library=stlport4" \
       FCFLAGS="-m64" \
       CPP="cpp" CXXCPP="cpp" \
       --enable-mpi-cxx \
       --enable-cxx-exceptions \
       --enable-mpi-java \
       --enable-heterogeneous \
       --enable-mpi-thread-multiple \
       --with-hwloc=internal \
       --without-verbs \
       --with-wrapper-cflags="-m64 -mt" \
       --with-wrapper-cxxflags="-m64 -library=stlport4" \
       --with-wrapper-fcflags="-m64" \
       --with-wrapper-ldflags="-mt" \
       --enable-debug \
       |& tee log.configure.$SYSTEM_ENV.$MACHINE_ENV.64_cc

    make |& tee log.make.$SYSTEM_ENV.$MACHINE_ENV.64_cc
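
    A quick way to check whether "-z noexecstack" actually reached the
    installed library (a sketch, assuming GNU binutils' readelf is
    available) is to inspect its GNU_STACK program header; "RWE" in the
    flags column means the stack is still marked executable:

    # Print the program headers of the Java bindings library and look at
    # the GNU_STACK entry; "RW " = non-executable stack, "RWE" = executable.
    readelf -lW /usr/local/openmpi-1.10.2_64_cc/lib64/libmpi_java.so.1.2.0 \
       | grep GNU_STACK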





    loki java 115 ompi_info | egrep -e "Open MPI repo revision:" -e "C compiler absolute:"
       Open MPI repo revision: v1.10.1-140-g31ff573
          C compiler absolute: /opt/solstudio12.4/bin/cc

    loki java 116 mpiexec -np 4 --host loki --slot-list 0:0-5,1:0-5 java MsgSendRecvMain
    Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library
    /usr/local/openmpi-1.10.2_64_cc/lib64/libmpi_java.so.1.2.0 which might
    have disabled stack guard. The VM will try to fix the stack guard now.
    It's highly recommended that you fix the library with 'execstack -c
    <libfile>', or link it with '-z noexecstack'.
    [the same warning is printed by each of the four Java processes]

    Now 3 processes are sending greetings.

    Greetings from process 1:
       message tag:    3
       message length: 4
       message:        loki
    ...
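
    For the moment the warning can be silenced by hand with the command
    the JVM itself suggests (assuming the "execstack" tool is installed),
    but I would prefer a proper fix in the build itself:

    # Clear the executable-stack flag on the installed library in place.
    execstack -c /usr/local/openmpi-1.10.2_64_cc/lib64/libmpi_java.so.1.2.0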


    Does anybody know how I can get rid of these warnings for good, or
    can somebody fix the problem directly in the distribution? Please let
    me know if you need anything else. Thank you very much in advance for
    any help.


    Best regards

    Siegmar

/* Small program that sends/receives a message with point-to-point
 * operations.
 *
 * "mpijavac" and Java-bindings are available in "Open MPI
 * version 1.7.4" or newer.
 *
 *
 * Class file generation:
 *   mpijavac MsgSendRecvMain.java
 *   mpijavac -d $HOME/mpi_classfiles MsgSendRecvMain.java
 *
 * Usage:
 *   mpiexec [parameters] java [parameters] MsgSendRecvMain
 *
 * Examples:
 *   mpiexec -np 2 java MsgSendRecvMain
 *   mpiexec -np 4 --host sunpc1,linpc1,rs0 \
 *      java -cp $HOME/mpi_classfiles MsgSendRecvMain
 *
 *
 * File: MsgSendRecvMain.java           Author: S. Gross
 * Date: 10.09.2015
 *
 */

import mpi.*;

public class MsgSendRecvMain
{
  static final int SENDTAG  = 1;        /* send message command         */
  static final int EXITTAG  = 2;        /* termination command          */
  static final int MSGTAG   = 3;        /* normal message tag           */
  static final int BUF_SIZE = 256;      /* message buffer size          */

  public static void main (String args[]) throws MPIException
  {
    int    mytid,                       /* my task id                   */
           ntasks,                      /* number of parallel tasks     */
           num,                         /* number of received data items*/
           i;                           /* loop variable                */
    char   buffer[];                    /* message buffer               */
    Status status;                      /* status of MPI operation      */

    MPI.Init (args);
    mytid  = MPI.COMM_WORLD.getRank ();
    ntasks = MPI.COMM_WORLD.getSize ();
    buffer = new char[BUF_SIZE];
    if (mytid == 0)
    {
      if (ntasks == 2)
      {
        System.out.println ("\nNow " + (ntasks - 1) + " process " +
                            "sends its greetings.\n");
      }
      else
      {
        System.out.println ("\nNow " + (ntasks - 1) + " processes " +
                            "are sending greetings.\n");
      }
      /* request messages                                               */
      for (i = 1; i < ntasks; ++i)
      {
        /* send only a message tag without a message                    */
        MPI.COMM_WORLD.send (buffer, 0, MPI.CHAR, i, SENDTAG);
      }
      /* wait for messages and print greetings                          */
      for (i = 1; i < ntasks; ++i)
      {
        status = MPI.COMM_WORLD.recv (buffer, BUF_SIZE, MPI.CHAR,
                                      MPI.ANY_SOURCE, MPI.ANY_TAG);
        num = status.getCount (MPI.CHAR);
        System.out.println ("Greetings from process " +
                            status.getSource () + ":\n" +
                            "  message tag:    " + status.getTag () +
                            "\n" +
                            "  message length: " + num +
                            "\n" +
                            "  message:        " +
                            String.valueOf (buffer, 0, num) + "\n");
      }
      /* terminate all processes                                        */
      for (i = 1; i < ntasks; ++i)
      {
        /* send only a message tag without a message                    */
        MPI.COMM_WORLD.send (buffer, 0, MPI.CHAR, i, EXITTAG);
      }

    }
    else
    {
      boolean more_to_do = true;

      while (more_to_do)
      {
        status = MPI.COMM_WORLD.recv (buffer, BUF_SIZE, MPI.CHAR, 0,
                                      MPI.ANY_TAG);
        if (status.getTag () != EXITTAG)
        {
          char processorName[];
          int  lengthOfName;

          processorName = (MPI.getProcessorName()).toCharArray();
          lengthOfName  = processorName.length;
          if (lengthOfName > BUF_SIZE)
          {
            System.out.println ("Message too large for buffer.\n" +
                                "I shorten it to buffer size.\n");
            lengthOfName = BUF_SIZE;
          }
          MPI.COMM_WORLD.send (processorName, lengthOfName, MPI.CHAR,
                               0, MSGTAG);
        }
        else
        {
          more_to_do = false;           /* terminate                    */
        }
      }
    }
    MPI.Finalize ();
  }
}
