Justin, thank you very much. Using --disable-threads fixed the problem; the installation completed without error. By the way, my gcc version is 4.0.1.
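For the archive, the full configure line that worked (my original command
plus Justin's flag):

    ./configure --with-fft=fftw3 --disable-threads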

Matt



On May 24, 2011, at 8:22 PM, gmx-users-requ...@gromacs.org wrote:

1. Re: Re: Gromacs Installation Error on Powerbook G4 Running OS 10.5.8


Matthew Bick wrote:

On May 23, 2011, at 4:24 PM, gmx-users-requ...@gromacs.org wrote:

Re: Gromacs Installation Error on Powerbook G4 Running OS 10.5.8


Hi Justin.  Thanks for your response.  See my responses below, embedded
in the original message:
Matthew Bick wrote:


Dear Gromacs community. I am attempting to install the latest version
of Gromacs (4.5.4) on my Mac Powerbook G4 (1.67 GHz, OS 10.5.8).  I
have successfully installed FFTW following the instructions provided
on the Gromacs installation page.
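(For reference, I built FFTW the standard way for a single-precision
Gromacs; this is a sketch from the instructions rather than an exact
transcript of what I typed:

    ./configure --enable-float    # single precision, the Gromacs default
    make
    sudo make install)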

"./configure --with-fft=fftw3" appears to work properly, but when I
perform  "Make" I get the following errors:

ld: symbol(s) not found
collect2: ld returned 1 exit status
make[4]: *** [libgmx.la] Error 1
make[3]: *** [all-recursive] Error 1
make[2]: *** [all-recursive] Error 1
make[1]: *** [all] Error 2
make: *** [all-recursive] Error 1


|The information about the actual missing symbols should be just above this
|output, so that would be useful information. Also pertinent would be the
|compiler versions you're using. From what you've posted below it looks like
|GCC, but which version?

I am using the latest Developer Tools for Leopard 10.5.8, so that comes
with gcc 4.0 and 4.2.  I can't say for certain which version was used
during configuration.  I do know that when ./configure looks for gcc it
finds it.


If you're not specifying a custom PATH, then your default gcc is being
detected.  gcc -v will tell you which is being used.
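For example (the gcc-4.2 binary name here is an assumption based on how
the Apple Developer Tools usually install versioned compilers):

    gcc -v                                    # reports the default compiler version
    CC=gcc-4.2 ./configure --with-fft=fftw3   # standard autoconf way to pick a compiler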

   Here are the missing symbols, plus the error codes once again:

Undefined symbols:
"___sync_lock_test_and_set", referenced from:
_tMPI_Lock_trylock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Lock_lock in lock.o
_tMPI_Type_commit in type.o
_tMPI_Type_contiguous in type.o
"___sync_bool_compare_and_swap", referenced from:
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_detach in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_push in list.o
_tMPI_Stack_pop in list.o
_tMPI_Once_wait in once.o
_tMPI_Once in once.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Prep_send_envelope in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Wait_process_incoming in p2p_protocol.o
_tMPI_Wait_process_incoming in p2p_protocol.o
_tMPI_Wait_process_incoming in p2p_protocol.o
_tMPI_Wait_process_incoming in p2p_protocol.o
"___sync_fetch_and_add", referenced from:
_tMPI_Once_wait in once.o
"___sync_lock_release", referenced from:
_tMPI_Lock_unlock in lock.o
_tMPI_Type_commit in type.o
_tMPI_Type_contiguous in type.o
"___sync_add_and_fetch", referenced from:
_tMPI_Alltoallv in alltoall.o
_tMPI_Alltoall in alltoall.o
_tMPI_Barrier_wait in barrier.o
_tMPI_Mult_recv in collective.o
_tMPI_Mult_recv in collective.o
_tMPI_Post_multi in collective.o
_tMPI_Post_multi in collective.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Xfer in p2p_protocol.o
_tMPI_Xfer in p2p_protocol.o
_tMPI_Wait_process_incoming in p2p_protocol.o
_tMPI_Reduce_fast in reduce.o
_tMPI_Reduce_fast in reduce.o
_tMPI_Scatterv in scatter.o
_tMPI_Scatter in scatter.o
"___sync_synchronize", referenced from:
_tMPI_Alltoallv in alltoall.o
_tMPI_Alltoallv in alltoall.o
_tMPI_Alltoall in alltoall.o
_tMPI_Alltoall in alltoall.o
_tMPI_Barrier_wait in barrier.o
_tMPI_Barrier_wait in barrier.o
_tMPI_Mult_recv in collective.o
_tMPI_Mult_recv in collective.o
_tMPI_Post_multi in collective.o
_tMPI_Post_multi in collective.o
_tMPI_Post_multi in collective.o
_tMPI_Event_wait in event.o
_tMPI_Lock_islocked in lock.o
_tMPI_Once_wait in once.o
_tMPI_Once_wait in once.o
_tMPI_Once_wait in once.o
_tMPI_Post_send in p2p_protocol.o
_tMPI_Xfer in p2p_protocol.o
_tMPI_Reduce_fast in reduce.o
_tMPI_Reduce_fast in reduce.o
_tMPI_Scatterv in scatter.o
_tMPI_Scatterv in scatter.o
_tMPI_Scatter in scatter.o
_tMPI_Scatter in scatter.o
    ld: symbol(s) not found
    collect2: ld returned 1 exit status
    make[4]: *** [libgmx.la] Error 1
    make[3]: *** [all-recursive] Error 1
    make[2]: *** [all-recursive] Error 1
    make[1]: *** [all] Error 2
    make: *** [all-recursive] Error 1



Threading is breaking down.  I believe the ability to build Gromacs with
threading support is compiler-dependent, and you may need a newer gcc,
although I could be wrong.  I seem to recall having to upgrade to gcc
4.4.4 before the newer features would work.
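If you want to confirm the compiler is the culprit, a minimal standalone
test of the __sync builtins that thread_mpi relies on might look like this
(my own sketch, not Gromacs code; these builtins appeared in gcc 4.1, so a
gcc 4.0 build should fail to link it with the same undefined symbols):

    /* sync_test.c - probe for GCC __sync atomic builtins */
    #include <stdio.h>

    int main(void)
    {
        int x = 0;
        __sync_add_and_fetch(&x, 1);                 /* atomic increment */
        if (__sync_bool_compare_and_swap(&x, 1, 2))  /* atomic CAS: 1 -> 2 */
            __sync_synchronize();                    /* full memory barrier */
        printf("builtins linked, x = %d\n", x);
        return 0;
    }

Compile with "gcc sync_test.c -o sync_test"; a failure at the link stage
mirrors the errors you're seeing.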

To test, configure with --disable-threads.  In that case, you'd have to
add --enable-mpi to use multiple cores with mdrun, and therefore invoke
mdrun_mpi via mpirun rather than through the internal threading library.
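A sketch of that build (the _mpi program suffix, the make targets, and
the core count are assumptions from the usual Gromacs 4.x MPI recipe, not
verified on your machine):

    ./configure --with-fft=fftw3 --disable-threads --enable-mpi --program-suffix=_mpi
    make mdrun && make install-mdrun
    mpirun -np 2 mdrun_mpi -deffnm md    # run mdrun on 2 cores via MPI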

-Justin

|I don't think those warnings are particularly harmful, but I know that
|trying to install a new version of Gromacs on PowerPC can be a challenge.
|If your compiler is relatively old, you may have to disable some of the
|newer features, like threading, although that should have been caught
|during configuration.
|
|Is the ./configure command given above the exact command you used? If not,
|posting that would be useful.


Yes, that ./configure command is the exact command I used.

Thank you again for any suggestions.  I know everyone is busy.

-Matt


-Justin



I did configure my shell (bash) environment, as per the Gromacs
installation instructions.  "make" runs for about 10 minutes before I
get these errors.  I have searched the mailing list and have seen
people report problems similar to mine, but couldn't determine how
those problems were resolved.  Also from the mailing list, I
understand that the errors I've listed above aren't that informative.
If it helps, I repeatedly get the following warnings during "make":

../../../../include/types/../thread_mpi/atomic/gcc_intrinsics.h: In
function ‘tMPI_Atomic_add_return’:
../../../../include/types/../thread_mpi/atomic/gcc_intrinsics.h:46:
warning: implicit declaration of function ‘__sync_add_and_fetch’
../../../../include/types/../thread_mpi/atomic/gcc_intrinsics.h: In
function ‘tMPI_Atomic_fetch_add’:
../../../../include/types/../thread_mpi/atomic/gcc_intrinsics.h:51:
warning: implicit declaration of function ‘__sync_fetch_and_add’
../../../../include/types/../thread_mpi/atomic/gcc_intrinsics.h: In
function ‘tMPI_Atomic_cas’:
../../../../include/types/../thread_mpi/atomic/gcc_intrinsics.h:57:
warning: implicit declaration of function ‘__sync_bool_compare_and_swap’

If anyone has any suggestions, they would be greatly appreciated.
Thanks.

Matthew Bick
Rockefeller University

--
========================================

Justin A. Lemkul
Ph.D. Candidate
ICTAS Doctoral Scholar
MILES-IGERT Trainee
Department of Biochemistry
Virginia Tech
Blacksburg, VA
jalemkul[at]vt.edu | (540) 231-9080
http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin


