If you had an older Open MPI installed into /usr/local before you installed 
Open MPI 1.8.4 there, it's quite possible that some of the older plugins are 
still present (and will not play nicely with the 1.8.4 install).

Specifically: installing a new Open MPI does not uninstall an older Open MPI.
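
One quick way to confirm that theory (a sketch, assuming you have the Xcode 
command line tools, which provide nm) is to inspect one of the components 
from your error output; a 1.6-era plugin will reference ORTE/OPAL symbols 
that no longer exist in the 1.8.4 libraries:

    # list the plugin's undefined symbols and look for the one
    # from the dlopen error in your output
    nm -u /usr/local/lib/openmpi/mca_ess_slurmd.so | grep orte_jmap

If _orte_jmap_t_class shows up, that plugin is asking for a symbol the 1.8.4 
libraries no longer provide, which is exactly the dlopen failure you're seeing.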

What you can probably do is:

    rm -rf /usr/local/lib/openmpi

This will completely delete *all* Open MPI plugins (both new and old) from the 
/usr/local tree.
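
If you want to see what you'd be deleting first, listing that directory by 
modification time usually makes the old leftovers stand out (assuming the 
two installs happened at different times):

    # oldest files sort last; anything dated from the old
    # install is a stale plugin
    ls -lt /usr/local/lib/openmpi | tail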

Then re-install 1.8.4 and see if that works for you.
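
For reference, if you built from the 1.8.4 source tarball, the re-install is 
just the usual sequence from the build directory (a sketch, assuming the 
default /usr/local prefix; prepend sudo to the install step if you need it):

    cd openmpi-1.8.4              # wherever you unpacked the tarball
    ./configure --prefix=/usr/local
    make all install

Afterwards, ompi_info should report "Open MPI: 1.8.4" near the top of its 
output, and the component_find warnings should be gone.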



> On Feb 25, 2015, at 7:52 AM, Javier Mas Solé <javier.mas.s...@gmail.com> 
> wrote:
> 
> I have a fresh install of openmpi-1.8.4 on a Mac with OS X 10.9.5. It 
> compiled and installed fine. 
> I have a Fortran code that runs perfectly on another similar machine with 
> openmpi-1.6.5. It compiled 
> without error on the new Mac, but when I try to mpirun it, I get the 
> message below.
> 
> Also, if I write echo $PATH I can spot /usr/local/bin, which was the 
> subject of a warning in the installation instructions.
> I have read in other forums that this might signal duplicate Open MPI 
> versions. I cannot rule this out, although 
> I don't find any duplicates in the /usr/local/bin folder.
> 
> I’m thinking of uninstalling this version and installing 1.6.5, which 
> works fine. 
> Can anyone tell me how to do this uninstall?
> 
> Thanks a lot
> 
> Javier
> 
> I have seen a similar post to this one.
> 
> fpmac114:AdSHW javier$ /usr/local/bin/mpirun sim1.exe
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_ess_slurmd: 
> dlopen(/usr/local/lib/openmpi/mca_ess_slurmd.so, 9): Symbol not found: 
> _orte_jmap_t_class
>   Referenced from: /usr/local/lib/openmpi/mca_ess_slurmd.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_ess_slurmd.so (ignored)
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_errmgr_default: 
> dlopen(/usr/local/lib/openmpi/mca_errmgr_default.so, 9): Symbol not found: 
> _orte_errmgr_base_error_abort
>   Referenced from: /usr/local/lib/openmpi/mca_errmgr_default.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_errmgr_default.so (ignored)
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_routed_cm: 
> dlopen(/usr/local/lib/openmpi/mca_routed_cm.so, 9): Symbol not found: 
> _orte_message_event_t_class
>   Referenced from: /usr/local/lib/openmpi/mca_routed_cm.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_routed_cm.so (ignored)
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_routed_linear: 
> dlopen(/usr/local/lib/openmpi/mca_routed_linear.so, 9): Symbol not found: 
> _orte_message_event_t_class
>   Referenced from: /usr/local/lib/openmpi/mca_routed_linear.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_routed_linear.so (ignored)
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_grpcomm_basic: 
> dlopen(/usr/local/lib/openmpi/mca_grpcomm_basic.so, 9): Symbol not found: 
> _opal_profile
>   Referenced from: /usr/local/lib/openmpi/mca_grpcomm_basic.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_grpcomm_basic.so (ignored)
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_grpcomm_hier: 
> dlopen(/usr/local/lib/openmpi/mca_grpcomm_hier.so, 9): Symbol not found: 
> _orte_daemon_cmd_processor
>   Referenced from: /usr/local/lib/openmpi/mca_grpcomm_hier.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_grpcomm_hier.so (ignored)
> [fpmac114.inv.usc.es:00398] mca: base: component_find: unable to open 
> /usr/local/lib/openmpi/mca_filem_rsh: 
> dlopen(/usr/local/lib/openmpi/mca_filem_rsh.so, 9): Symbol not found: 
> _opal_uses_threads
>   Referenced from: /usr/local/lib/openmpi/mca_filem_rsh.so
>   Expected in: flat namespace
>  in /usr/local/lib/openmpi/mca_filem_rsh.so (ignored)
> [fpmac114:00398] *** Process received signal ***
> [fpmac114:00398] Signal: Segmentation fault: 11 (11)
> [fpmac114:00398] Signal code: Address not mapped (1)
> [fpmac114:00398] Failing at address: 0x100000013
> [fpmac114:00398] [ 0] 0   libsystem_platform.dylib            
> 0x00007fff933125aa _sigtramp + 26
> [fpmac114:00398] [ 1] 0   ???                                 
> 0x00007fff5b7f00ff 0x0 + 140734728438015
> [fpmac114:00398] [ 2] 0   libopen-rte.7.dylib                 
> 0x0000000104469ee5 orte_rmaps_base_map_job + 1525
> [fpmac114:00398] [ 3] 0   libopen-pal.6.dylib                 
> 0x00000001044e4346 opal_libevent2021_event_base_loop + 2214
> [fpmac114:00398] [ 4] 0   mpirun                              
> 0x0000000104411bc0 orterun + 6320
> [fpmac114:00398] [ 5] 0   mpirun                              
> 0x00000001044102f2 main + 34
> [fpmac114:00398] [ 6] 0   libdyld.dylib                       
> 0x00007fff8d08a5fd start + 1
> [fpmac114:00398] [ 7] 0   ???                                 
> 0x0000000000000002 0x0 + 2
> [fpmac114:00398] *** End of error message ***
> Segmentation fault: 11
> 
> _______________________________________________
> users mailing list
> us...@open-mpi.org
> Subscription: http://www.open-mpi.org/mailman/listinfo.cgi/users
> Link to this post: 
> http://www.open-mpi.org/community/lists/users/2015/02/26394.php


-- 
Jeff Squyres
jsquy...@cisco.com
For corporate legal information go to: 
http://www.cisco.com/web/about/doing_business/legal/cri/
