Oh, and here's the directory that was built from the OpenMPI build:

C:\Program Files\OpenMPI_v1.4-win32\bin>dir
 Volume in drive C is Work
 Volume Serial Number is 6A2D-3CBC

 Directory of C:\Program Files\OpenMPI_v1.4-win32\bin

01/24/2010  05:14 AM    <DIR>          .
01/24/2010  05:14 AM    <DIR>          ..
01/24/2010  02:39 AM         2,637,824 libmpid.dll
01/24/2010  02:39 AM        11,268,096 libmpid.pdb
01/24/2010  02:39 AM           139,264 libmpi_cxxd.dll
01/24/2010  02:39 AM           962,560 libmpi_cxxd.pdb
01/24/2010  02:29 AM           458,752 libopen-pald.dll
01/24/2010  02:29 AM         1,953,792 libopen-pald.pdb
01/24/2010  02:30 AM           860,160 libopen-rted.dll
01/24/2010  02:30 AM         2,936,832 libopen-rted.pdb
07/11/2009  07:10 PM             1,870 Microsoft.VC80.CRT.manifest
01/24/2010  02:29 AM            44,544 mpic++.exe
01/24/2010  02:29 AM            44,544 mpicc.exe
01/24/2010  02:29 AM            44,544 mpicxx.exe
01/24/2010  02:29 AM            44,544 mpiexec.exe
01/24/2010  02:30 AM            98,304 mpirun.exe
07/12/2009  01:55 AM           479,232 msvcm80.dll
07/12/2009  01:55 AM           554,832 msvcp80.dll
07/12/2009  01:55 AM           632,656 msvcr80.dll
01/24/2010  02:30 AM            38,912 ompi-checkpoint.exe
01/24/2010  02:30 AM            31,744 ompi-clean.exe
01/24/2010  02:30 AM            48,640 ompi-ps.exe
01/24/2010  02:39 AM            34,304 ompi-server.exe
01/24/2010  02:39 AM           229,376 ompi_info.exe
01/24/2010  02:29 AM            37,888 opal-restart.exe
01/24/2010  02:29 AM            44,544 opal-wrapper.exe
01/24/2010  02:30 AM            38,912 orte-checkpoint.exe
01/24/2010  02:30 AM            31,744 orte-clean.exe
01/24/2010  02:30 AM            48,640 orte-ps.exe
01/24/2010  02:30 AM            29,184 orted.exe
01/24/2010  02:30 AM            98,304 orterun.exe
              29 File(s)     23,874,542 bytes
               2 Dir(s)  135,862,136,832 bytes free
-------- Original Message --------
Subject: [OMPI users] Windows CMake build problems ...
From: cjohn...@valverdecomputing.com
Date: Tue, January 26, 2010 1:43 am
To: f...@hlrs.de
Cc: "Open MPI Users" <us...@open-mpi.org>

The mpicc, mpic++ and mpicxx wrappers apparently don't work, even though the rest of the commands do:


C:\prog\mon\examples>ompi_info
                 Package: Open MPI Charles Johnson@WORK Distribution
                Open MPI: 1.4
   Open MPI SVN revision: r22285
   Open MPI release date: Dec 08, 2009
                Open RTE: 1.4
   Open RTE SVN revision: r22285
   Open RTE release date: Dec 08, 2009
                    OPAL: 1.4
       OPAL SVN revision: r22285
       OPAL release date: Dec 08, 2009
            Ident string: 1.4
                  Prefix: C:\Program Files\OpenMPI_v1.4-win32
 Configured architecture: x86 Windows-6.1
          Configure host: WORK
           Configured by: Charles Johnson
           Configured on: 02:27 AM Sun 01/24/2010
          Configure host: WORK
                Built by: Charles Johnson
                Built on: 02:27 AM Sun 01/24/2010
              Built host: WORK
              C bindings: yes
            C++ bindings: yes
      Fortran77 bindings: no
      Fortran90 bindings: no
 Fortran90 bindings size: na
              C compiler: cl
     C compiler absolute: cl
            C++ compiler: cl
   C++ compiler absolute: cl
      Fortran77 compiler: CMAKE_Fortran_COMPILER-NOTFOUND
  Fortran77 compiler abs: none
      Fortran90 compiler:
  Fortran90 compiler abs: none
             C profiling: yes
           C++ profiling: yes
     Fortran77 profiling: no
     Fortran90 profiling: no
          C++ exceptions: no
          Thread support: no
           Sparse Groups: no
  Internal debug support: no
     MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
         libltdl support: no
   Heterogeneous support: no
 mpirun default --prefix: yes
         MPI I/O support: yes
       MPI_WTIME support: gettimeofday
Symbol visibility support: yes
   FT Checkpoint support: yes  (checkpoint thread: no)
           MCA backtrace: none (MCA v2.0, API v2.0, Component v1.4)
           MCA paffinity: windows (MCA v2.0, API v2.0, Component v1.4)
               MCA carto: auto_detect (MCA v2.0, API v2.0, Component v1.4)
           MCA maffinity: first_use (MCA v2.0, API v2.0, Component v1.4)
               MCA timer: windows (MCA v2.0, API v2.0, Component v1.4)
         MCA installdirs: windows (MCA v2.0, API v2.0, Component v1.4)
         MCA installdirs: env (MCA v2.0, API v2.0, Component v1.4)
         MCA installdirs: config (MCA v2.0, API v2.0, Component v1.4)
                 MCA crs: none (MCA v2.0, API v2.0, Component v1.4)
                 MCA dpm: orte (MCA v2.0, API v2.0, Component v1.4)
              MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.4)
           MCA allocator: basic (MCA v2.0, API v2.0, Component v1.4)
           MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.4)
                MCA coll: basic (MCA v2.0, API v2.0, Component v1.4)
                MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.4)
                MCA coll: self (MCA v2.0, API v2.0, Component v1.4)
                MCA coll: sm (MCA v2.0, API v2.0, Component v1.4)
                MCA coll: sync (MCA v2.0, API v2.0, Component v1.4)
               MCA mpool: rdma (MCA v2.0, API v2.0, Component v1.4)
               MCA mpool: sm (MCA v2.0, API v2.0, Component v1.4)
                 MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.4)
                 MCA bml: r2 (MCA v2.0, API v2.0, Component v1.4)
                 MCA btl: self (MCA v2.0, API v2.0, Component v1.4)
                 MCA btl: sm (MCA v2.0, API v2.0, Component v1.4)
                 MCA btl: tcp (MCA v2.0, API v2.0, Component v1.4)
                MCA topo: unity (MCA v2.0, API v2.0, Component v1.4)
                 MCA osc: pt2pt (MCA v2.0, API v2.0, Component v1.4)
                 MCA osc: rdma (MCA v2.0, API v2.0, Component v1.4)
                 MCA iof: hnp (MCA v2.0, API v2.0, Component v1.4)
                 MCA iof: orted (MCA v2.0, API v2.0, Component v1.4)
                 MCA iof: tool (MCA v2.0, API v2.0, Component v1.4)
                 MCA oob: tcp (MCA v2.0, API v2.0, Component v1.4)
                MCA odls: process (MCA v2.0, API v2.0, Component v1.4)
               MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.4)
               MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.4)
                 MCA rml: ftrm (MCA v2.0, API v2.0, Component v1.4)
                 MCA rml: oob (MCA v2.0, API v2.0, Component v1.4)
              MCA routed: binomial (MCA v2.0, API v2.0, Component v1.4)
              MCA routed: linear (MCA v2.0, API v2.0, Component v1.4)
                 MCA plm: process (MCA v2.0, API v2.0, Component v1.4)
              MCA errmgr: default (MCA v2.0, API v2.0, Component v1.4)
                 MCA ess: env (MCA v2.0, API v2.0, Component v1.4)
                 MCA ess: hnp (MCA v2.0, API v2.0, Component v1.4)
                 MCA ess: singleton (MCA v2.0, API v2.0, Component v1.4)
             MCA grpcomm: basic (MCA v2.0, API v2.0, Component v1.4)

C:\prog\mon\examples>mpirun --help
mpirun (Open MPI) 1.4

Usage: mpirun [OPTION]...  [PROGRAM]...
Start the given program using Open RTE

   -am <arg0>            Aggregate MCA parameter set file list
   --app <arg0>          Provide an appfile; ignore all other command line
                         options
   -bind-to-board|--bind-to-board
                         Whether to bind processes to specific boards
                         (meaningless on 1 board/node)
   -bind-to-core|--bind-to-core
                         Whether to bind processes to specific cores (the
                         default)
   -bind-to-none|--bind-to-none
                         Do not bind processes to cores or sockets
   -bind-to-socket|--bind-to-socket
                         Whether to bind processes to sockets
   -byboard|--byboard    Whether to assign processes round-robin by board
                         (equivalent to bynode if only 1 board/node)
   -bycore|--bycore      Alias for byslot
   -bynode|--bynode      Whether to assign processes round-robin by node
   -byslot|--byslot      Whether to assign processes round-robin by slot
                         (the default)
   -bysocket|--bysocket  Whether to assign processes round-robin by socket
-c|-np|--np <arg0>       Number of processes to run
   -cf|--cartofile <arg0>
                         Provide a cartography file ..........

... and so on. But when I run mpicc, nothing good happens:

C:\prog\mon\examples>mpicc hello_c.c
--------------------------------------------------------------------------
Sorry!  You were supposed to get help about:
    no-compiler-found
But I couldn't open the help file:
    C:\Program Files\OpenMPI_v1.4-win32\share\openmpi\help-opal-wrapper.txt: No such file or directory.  Sorry!
--------------------------------------------------------------------------
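Since the error is about a missing help file rather than the compiler itself, a first check is whether the CMake INSTALL step populated the share tree at all (path taken from the error message above):

```shell
dir "C:\Program Files\OpenMPI_v1.4-win32\share\openmpi"
```

If help-opal-wrapper.txt and the other data files are absent there, the wrapper's install rules (not the compiler detection) would be the thing to look at.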

Looks like the wiring from the wrappers to the Visual C++ compiler is disconnected.
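Until the wrapper is sorted out, one possible workaround is to invoke cl directly against the install tree. This is only a sketch: the include\ and lib\ subdirectories and the import-library name libmpid.lib are assumptions inferred from the debug-suffixed libmpid.dll in the bin listing above, not confirmed contents of the install.

```shell
REM Assumed layout of the OpenMPI_v1.4-win32 install tree (unverified).
set "OMPI=C:\Program Files\OpenMPI_v1.4-win32"
cl hello_c.c /I"%OMPI%\include" /link /LIBPATH:"%OMPI%\lib" libmpid.lib
```

For comparison, on a working install the wrapper itself will reveal the command line it would have run: opal-wrapper-based compilers accept a -showme flag (e.g. `mpicc -showme`), which prints the underlying compiler invocation instead of executing it.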

Charlie ...
