Thanks for all the responses. I'll try the suggestions. I think I'll have to do it
without fftw then.

--- On Mon, 7/4/11, Mirco Wahab <mirco.wa...@chemie.tu-freiberg.de> wrote:


From: Mirco Wahab <mirco.wa...@chemie.tu-freiberg.de>
Subject: Re: [gmx-users] On multi-core PCs and gromacs installation
To: gmx-users@gromacs.org
Date: Monday, July 4, 2011, 11:08 PM


>> I will be installing gromacs 4.5.x in another computer but this time with 
>> four cores. The PC runs in windows and I will be using cygwin.
>> ... Do I still need to install MPI using cygwin?
> 
> Probably not, but I haven't tested threading on Cygwin.

I just did a test for fun and it worked remarkably well,
even on a Cygwin 1.7 + Win7/x64U box. It's very simple
using 'make'; I didn't check cmake (it seems more complicated
here).

The following sequence should lead to a
fully functional Gromacs 4.5.4 for the CYGWIN_i686
target on Windows:

Install Cygwin 1.7.9 from its home page and add the following packages:
+ gcc/g++ 4.x and 3.x (but not the mingw-gcc variants)
+ make/cmake
+ lapack, lapack-devel, lapack-libs
+ fftw3, *-devel, *-libs
+ gsl, *-devel, *-libs
+ wget, tar, vim
+ bash, rxvt
(do not install the 'pthread library stub' entry)
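
To sanity-check that the packages actually got installed, Cygwin's
cygcheck can list them (just a quick check, not a required step):
$> cygcheck -c | grep -E 'fftw3|lapack|gsl'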

Open a cygwin bash shell, create a gromacs base directory
and change to it(!):
$> mkdir gromacs; cd gromacs

Download source and unpack it:
$> wget ftp://ftp.gromacs.org/pub/gromacs/gromacs-4.5.4.tar.gz
$> tar xzf gromacs-4.5.4.tar.gz
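
A quick look should confirm the unpack worked; the source tree
now sits next to the tarball:
$> ls
gromacs-4.5.4  gromacs-4.5.4.tar.gz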

Write a fancy build control file (type/paste the following after
the cat command, then finish the input with Ctrl-D):
$> cat > mk_gromacs.sh
#!/bin/sh
../gromacs-4.5.4/configure CC=gcc CXX=g++ \
    LDFLAGS="-L/usr/lib64 -llapack -lblas -lpthread" \
    --prefix=/usr/local/gromacs454 \
    --with-fft=fftw3 \
    --with-external-blas \
    --with-external-lapack \
    --with-gsl

if [ $? -eq 0 ]; then
    make -j 4
    if [ $? -eq 0 ]; then
        echo "********************"
        echo "Success!"
        echo "********************"
        echo "now: ==>  make install (as root) / exit"
    fi
fi
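
If configure rejects one of the options above on your setup, its
built-in help lists what is actually available (run from the base
directory; just a way to cross-check the flags):
$> gromacs-4.5.4/configure --help | grep -E 'fft|lapack|gsl'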

Adjust the number in 'make -j 4' to match the number of
processor cores in your machine, which is the output of:
$> echo $NUMBER_OF_PROCESSORS
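
If that variable happens to be unset in your shell, nproc from
coreutils should report the same count (assuming your coreutils
version is recent enough to ship it):
$> nproc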

Create a build directory and change to it(!):
$> mkdir build; cd build

Run your control script (see above):
$> sh ../mk_gromacs.sh
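
A small variation keeps the complete output around for later
inspection, which is handy if configure fails halfway through:
$> sh ../mk_gromacs.sh 2>&1 | tee build.log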

Wait and check for error messages during configuration (if any);
otherwise, go get a large cup of coffee and lie back.

Once it has finished (without errors), install it:
$> make install

Initialize Gromacs environment variables in
your shell by modifying your .bashrc file:

$> vi .bashrc
- move the cursor down to the start of an empty line
- press 'i' (insert)
- insert the following text

  # GROMACS
  if [ -e /usr/local/gromacs454/bin/GMXRC.bash ] ; then
   source /usr/local/gromacs454/bin/GMXRC.bash
  fi

- press 'Escape', then type ':x' and Enter (to write and close the file)
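
To pick up the change without reopening the shell, you can also
source the file directly:
$> source ~/.bashrc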

After opening a new shell (or sourcing .bashrc as above),
mdrun, grompp and friends should be available
and working fine.
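
A quick smoke test (the -version flag should be understood by the
4.5.x binaries):
$> which mdrun
/usr/local/gromacs454/bin/mdrun
$> mdrun -version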

> Nothing will do MPI for you. Threading and MPI are complementary approaches 
> to achieving parallelism, and which is better depends on your execution 
> environment.

On Cygwin, OpenMPI wouldn't even work anymore, as
it requires MS Visual Studio library linkage
nowadays. One could try MPICH2, which still compiled
on Cygwin the last time I tried (two years ago?),
but why bother? Gromacs threading works perfectly
on Cygwin.
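
With the built-in thread-MPI, the thread count can be set at run
time via mdrun's -nt flag (topol.tpr here is just a placeholder
for a prepared run input):
$> mdrun -nt 4 -s topol.tpr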

Regards

M.
-- gmx-users mailing list    gmx-users@gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the www interface or 
send it to gmx-users-requ...@gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists