Dear All,

> If you suspect a faulty installation, then you need to post more
> information, like:
> 
> - compilers used
> - cmake or autoconf?
> - hardware
> - commands used to install
> 
> -Justin
> 
> > Sincerely
> > 
> > Stephan Watkins
> > 


The compilers are all standard GNU compilers.

I used autoconf with the following commands:

1) ./configure --disable-float --enable-shared --prefix=$HOME --fftw=fftw3

   followed by a normal make, make install, then make distclean

2) ./configure --disable-float --enable-shared --prefix=$HOME --fftw=fftw3 --enable-mpi

   followed by make mdrun, make install-mdrun
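For reference, the two-pass build described above can be sketched as a single script. This is only a restatement of the commands as reported (spelled as ./configure); whether --fftw=fftw3 is the exact FFTW flag for this GROMACS version is an assumption carried over from the original email, and the script assumes it is run from inside the GROMACS source tree:

```shell
# Sketch of the reported two-pass build; run from the GROMACS source tree,
# installing under $HOME. Flags are as reported in the email.

# Pass 1: double precision (--disable-float), shared libraries, serial tools.
./configure --disable-float --enable-shared --prefix=$HOME --fftw=fftw3
make
make install
make distclean   # clean the tree before reconfiguring for MPI

# Pass 2: identical flags plus MPI; only mdrun is rebuilt and installed,
# so the MPI-enabled mdrun joins the serial tools under $HOME.
./configure --disable-float --enable-shared --prefix=$HOME --fftw=fftw3 \
            --enable-mpi
make mdrun
make install-mdrun
```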

-Hardware: the University of Bern Ubelix system. The main server is, I believe, a normal Intel i5 with 4 processors (I am not sure; checked using arch), and it is attached to the main hub of 1200 CPUs arranged in 8-CPU nodes (ranging from 8 AMD 6-core chips to 8 older Intel Xeon or similar chips), plus a secondary 1200-CPU system (NorduGrid), which is about the same but with higher-end chips at the moment.

-GROMACS is installed locally, with shared libraries. All the commands print their help menus.

-As mentioned to the University cluster folks, I am not sure whether the problem is in the GROMACS libraries, in the main shared libraries mentioned in the error, or somewhere else. It looks like memory address read/write errors to me, but I am not much of a system maintainer, so I do not know whether it is a compilation error, something buggy on this system with the new GROMACS, or whether something changed in the libraries on the local machine.
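Since the suspicion falls on the shared libraries, one quick check is to ask the dynamic loader which libraries the binary actually resolves. This is only a sketch; $HOME/bin/mdrun and $HOME/lib are assumed paths following the --prefix=$HOME install above:

```shell
# List the shared libraries mdrun links against; any "not found" line
# means the loader cannot locate that library on this machine.
if [ -x "$HOME/bin/mdrun" ]; then
    ldd "$HOME/bin/mdrun"
fi

# If GROMACS's own libraries come up "not found", point the loader at
# the local install before running:
export LD_LIBRARY_PATH="$HOME/lib:$LD_LIBRARY_PATH"
```

If ldd resolves everything but the crashes persist, the problem is more likely in the build itself than in library lookup.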

Sincerely,

Stephan Watkins
-- 
gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists