This problem was recently fixed in hwloc upstream:

https://github.com/open-mpi/hwloc/commit/790aa2e1e62be6b4f37622959de9ce3766ebc57e
Brice


On 11/12/2014 23:40, Jorge D'Elia wrote:
> Dear Jeff,
>
> Our updates of Open MPI to 1.8.3 (and 1.8.4) all went
> fine using Fedora >= 17 and the system gcc compilers
> on ia32 or ia64 machines.
>
> However, the "make all" step failed using Fedora 14
> with a beta gcc 5.0 compiler on an ia32 machine,
> with a message like:
>
> Error: symbol `Lhwloc1' is already defined
>
> A roundabout way to solve it was, first, to perform
> a separate installation of the hwloc package (we use
> release v1.10.0 (stable)) and, second, to configure
> Open MPI using its flag:
>
>   --with-hwloc=${HWLOC_HOME}
>
> although, in this way, the include and library paths
> must also be given, e.g.
>
>  export CFLAGS="-I/usr/beta/hwloc/include" ; echo ${CFLAGS}
>  export LDFLAGS="-L/usr/beta/hwloc/lib"    ; echo ${LDFLAGS}
>  export LIBS="-lhwloc"                     ; echo ${LIBS}
>
> In order to verify that hwloc works correctly, it would be useful
> to include in the Open MPI distribution a simple test like:
>
> $ gcc ${CFLAGS} ${LDFLAGS} -o hwloc-hello.exe hwloc-hello.c ${LIBS}
> $ ./hwloc-hello.exe
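>
> For reference, a minimal hwloc-hello.c along these lines would be
> enough (a sketch only, assuming the hwloc 1.x C API; hwloc's own
> documentation includes a fuller hwloc-hello.c example):
>
>   /* Minimal sketch: count the cores seen by hwloc (hwloc 1.x API). */
>   #include <stdio.h>
>   #include <hwloc.h>
>
>   int main(void)
>   {
>       hwloc_topology_t topology;
>
>       /* Build the topology of the local machine. */
>       hwloc_topology_init(&topology);
>       hwloc_topology_load(topology);
>
>       /* Report how many cores hwloc detected. */
>       printf("cores detected: %d\n",
>              hwloc_get_nbobjs_by_type(topology, HWLOC_OBJ_CORE));
>
>       hwloc_topology_destroy(topology);
>       return 0;
>   }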
>
> (we apologize for forgetting to use the --with-hwloc-libdir flag ...).
>
> With this preliminary step we could overcome the fatal error
> related to the hwloc package in the configuration step.
>
> This (now fixed) trouble in the configuration step is the same
> as the one reported as:
>
> Open MPI 1.8.1: "make all" error: symbol `Lhwloc1' is already defined
>
> on 2014-08-12 15:08:38
>
>
> Regards,
> Jorge.
>
> ----- Original message -----
>> From: "Jorge D'Elia" <jde...@intec.unl.edu.ar>
>> To: "Open MPI Users" <us...@open-mpi.org>
>> Sent: Tuesday, August 12, 2014 16:08:38
>> Subject: Re: [OMPI users] Open MPI 1.8.1: "make all" error: symbol `Lhwloc1' 
>> is already defined
>>
>> Dear Jeff,
>>
>> These new versions of the tgz files replace the previous ones:
>> I had used an old, outdated session environment. However, the
>> configuration and installation were again OK in each case.
>> Sorry for the noise caused by the previous tgz files.
>>
>> Regards,
>> Jorge.
>>
>> ----- Original message -----
>>> From: "Jorge D'Elia" <jde...@intec.unl.edu.ar>
>>> To: "Open MPI Users" <us...@open-mpi.org>
>>> Sent: Tuesday, August 12, 2014 15:16:19
>>> Subject: Re: [OMPI users] Open MPI 1.8.1: "make all" error: symbol `Lhwloc1'
>>> is already defined
>>>
>>> Dear Jeff,
>>>
>>> ----- Original message -----
>>>> From: "Jeff Squyres (jsquyres)" <jsquy...@cisco.com>
>>>> To: "Open MPI User's List" <us...@open-mpi.org>
>>>> Sent: Monday, August 11, 2014 11:47:29
>>>> Subject: Re: [OMPI users] Open MPI 1.8.1: "make all" error: symbol
>>>> `Lhwloc1'
>>>> is already defined
>>>>
>>>> The problem appears to be occurring in the hwloc component in OMPI.
>>>> Can you download hwloc 1.7.2 (standalone) and try to build that on
>>>> the target machine and see what happens?
>>>>
>>>>     http://www.open-mpi.org/software/hwloc/v1.7/
>>> OK. Just in case, I tried both version 1.7.2 and version 1.9 (stable).
>>> Neither gave any errors during configuration or installation.
>>> Attached is a *.tgz file for each case. Greetings. Jorge.
>>>
>>>  
>>>> On Aug 10, 2014, at 11:16 AM, Jorge D'Elia <jde...@intec.unl.edu.ar>
>>>> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I tried to recompile Open MPI version 1.8.1 for Linux
>>>>> on an ia32 machine with Fedora 14, albeit using the
>>>>> latest version of gfortran (gfortran 4.10 is required
>>>>> by a user program, which otherwise runs fine).
>>>>>
>>>>> However, the "make all" phase breaks with the
>>>>> error message:
>>>>>
>>>>>  Error: symbol `Lhwloc1' is already defined
>>>>>
>>>>> I attached a tgz file (tar -zcvf) with:
>>>>>
>>>>>  Output "configure.txt" from "./configure" Open MPI phase;
>>>>>  The "config.log" file from the top-level Open MPI directory;
>>>>>  Output "make.txt"    from "make all" to build Open MPI;
>>>>>  Output "make-v1.txt" from "make V=1" to build Open MPI;
>>>>>  Outputs from cat /proc/version and cat /proc/cpuinfo
>>>>>
>>>>> Please, any clue on how to fix this?
>>>>>
>>>>> Thanks in advance and regards.
>>>>> Jorge.
>>>>>
>>>>> --
>>>>> CIMEC (UNL-CONICET) Predio CONICET-Santa Fe, Colectora Ruta Nac 168,
>>>>> Paraje El Pozo, S3000GLN Santa Fe, ARGENTINA, http://www.cimec.org.ar/
>>>>> Tel +54 342 451.15.94/95 ext 1018, fax: +54-342-451.11.69
>>>>> <symbol-already-defined.tgz>
>>>>
>>>> --
>>>> Jeff Squyres
>>>> jsquy...@cisco.com
>>>> For corporate legal information go to:
>>>> http://www.cisco.com/web/about/doing_business/legal/cri/
>>>>
>> --
>> CIMEC (UNL-CONICET) Predio CONICET-Santa Fe, Colectora Ruta Nac 168,
>> Paraje El Pozo, S3000GLN Santa Fe, ARGENTINA, http://www.cimec.org.ar/
>> Tel +54 342 451.15.94/95 ext 1018, fax: +54-342-451.11.69
>>