Re: [gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5

2013-11-08 Thread Mark Abraham
OK, thanks. Please open a new issue at redmine.gromacs.org, describe your observations as above, and upload a tarball of your input files. Mark On Fri, Nov 8, 2013 at 2:14 PM, Qin Qiao wrote: > On Fri, Nov 8, 2013 at 7:18 PM, Mark Abraham >wrote: > > > Hi, > > > > That shouldn't happen if yo

Re: [gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5

2013-11-08 Thread Qin Qiao
On Fri, Nov 8, 2013 at 7:18 PM, Mark Abraham wrote: > Hi, > > That shouldn't happen if your MPI library is working (have you tested it > with other programs?) and configured properly. It's possible this is a > known bug, so please let us know if you can reproduce it in the latest > releases. > > M

Re: [gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5

2013-11-08 Thread Mark Abraham
Hi, That shouldn't happen if your MPI library is working (have you tested it with other programs?) and configured properly. It's possible this is a known bug, so please let us know if you can reproduce it in the latest releases. Mark On Fri, Nov 8, 2013 at 6:55 AM, Qin Qiao wrote: > Dear all,

[gmx-users] mpi segmentation error in continuation of REMD simulation with gromacs 4.5.5

2013-11-07 Thread Qin Qiao
Dear all, I'm trying to continue a REMD simulation using gromacs 4.5.5 under an NPT ensemble, and I got the following errors when I tried to use 2 cores per replica: "[node-ib-4.local:mpi_rank_25][error_sighandler] Caught error: Segmentation fault (signal 11) [node-ib-13.local:mpi_rank_63][error_sigh
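
For reference, a minimal sketch of a 4.5-era REMD continuation matching the setup described here (64 replicas at 2 cores each; file names, counts and the exchange interval are hypothetical). With -multi, mdrun inserts the replica index before each file extension (remd0.tpr ... remd63.tpr, state0.cpt ...):

    mpirun -np 128 mdrun_mpi -s remd.tpr -multi 64 -replex 1000 -cpi state.cpt -append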

Re: [gmx-users] MPI runs on a local computer

2013-09-20 Thread Mark Abraham
On Thu, Sep 19, 2013 at 2:48 PM, Xu, Jianqing wrote: > > Dear all, > > I am learning the parallelization issues from the instructions on Gromacs > website. I guess I got a rough understanding of MPI, thread-MPI, OpenMP. But > I hope to get some advice about a correct way to run jobs. > > Say I h

Re: [gmx-users] MPI runs on a local computer

2013-09-20 Thread Carsten Kutzner
Hi Jianqing, On Sep 19, 2013, at 2:48 PM, "Xu, Jianqing" wrote: > Say I have a local desktop having 16 cores. If I just want to run jobs on one > computer or a single node (but multiple cores), I understand that I don't > have to install and use OpenMPI, as Gromacs has its own thread-MPI includ

[gmx-users] MPI runs on a local computer

2013-09-19 Thread Xu, Jianqing
Dear all, I am learning the parallelization issues from the instructions on Gromacs website. I guess I got a rough understanding of MPI, thread-MPI, OpenMP. But I hope to get some advice about a correct way to run jobs. Say I have a local desktop having 16 cores. If I just want to run jobs on
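
A minimal sketch of the two single-node options discussed in this thread, assuming a 4.6-series mdrun built with both thread-MPI and OpenMP (on a 4.5 binary only the -nt thread-MPI form applies); the -deffnm name is hypothetical:

    mdrun -nt 16 -deffnm md             # let mdrun use all 16 cores via built-in thread-MPI
    mdrun -ntmpi 4 -ntomp 4 -deffnm md  # or split explicitly: 4 thread-MPI ranks x 4 OpenMP threads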

Re: [gmx-users] mpi enabled gmx 4.5

2013-04-15 Thread Justin Lemkul
On 4/15/13 12:43 PM, 라지브간디 wrote: Dear gmx users, I have installed the mpi (openmpi) enabled gmx4.5.5 version by the following command line. It installed without any errors. However, when I look in the installation folder it doesn't have any files under gromacs except bin (only has mdrun_mp

[gmx-users] mpi enabled gmx 4.5

2013-04-15 Thread 라지브간디
Dear gmx users, I have installed the mpi (openmpi) enabled gmx4.5.5 version by the following command line. It installed without any errors. However, when I look in the installation folder it doesn't have any files under gromacs except bin (only has mdrun_mpi) and the lib folder has lib files. I can
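
For context, the usual 4.5-era recipe builds the full tool set in a serial pass and only mdrun in a second, MPI-enabled pass, which is why an MPI-only build leaves little besides mdrun_mpi. A sketch, with a hypothetical install prefix:

    ./configure --prefix=/usr/local/gromacs            # serial pass: all tools
    make && make install
    make distclean
    ./configure --prefix=/usr/local/gromacs --enable-mpi --program-suffix=_mpi
    make mdrun && make install-mdrun                   # MPI pass: mdrun only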

Re: [gmx-users] MPI oversubscription

2013-02-06 Thread Roland Schulz
>> > ret = gmx_omp_get_num_procs(); >> > } >> > >> > >> > Cheers, >> > >> > Berk >> > >> > >> > > Date: Tue, 5 Feb 2013 14:27:44 +0100 >> >

Re: [gmx-users] MPI oversubscription

2013-02-06 Thread Roland Schulz
> To make it work for now, you can insert immediately after #ifdef > > GMX_OMPENMP: > > if (ret <= 0) > > { > > ret = gmx_omp_get_num_procs(); > > } > > > > > > Cheers, > > > > Berk > > > > -

Re: [gmx-users] MPI oversubscription

2013-02-06 Thread Christian H.
http://pastebin.com/6t0y5mTX So it's the same problem with io.h etc. if I just run cmake without any mpi flags in my build directory.

RE: [gmx-users] MPI oversubscription

2013-02-06 Thread Berk Hess
need openmpi, as you can use the Gromacs built-in thread-MPI library. Cheers, Berk > Date: Wed, 6 Feb 2013 09:38:09 +0100 > Subject: Re: [gmx-users] MPI oversubscription > From: hypo...@googlemail.com > To: gmx-users@gromacs.org > > And if

Re: [gmx-users] MPI oversubscription

2013-02-06 Thread Christian H.
es >>> > are not found. >>> > >>> >>> Could you post your CMakeFiles/CMakeError.log? That should show why those >>> features are disabled. >>> >>> Roland >>> >>> >>> > >>> > Cheers, >

Re: [gmx-users] MPI oversubscription

2013-02-06 Thread Christian H.
>> > are not found. >> > >> >> Could you post your CMakeFiles/CMakeError.log? That should show why those >> features are disabled. >> >> Roland >> >> >> > >> > Cheers, >> > >> > Berk >> > >> >

Re: [gmx-users] MPI oversubscription

2013-02-05 Thread Roland Schulz
found. > Could you post your CMakeFiles/CMakeError.log? That should show why those features are disabled. Roland > > Cheers, > > Berk > > > ------------ > > Date: Tue, 5 Feb 2013 14:52:17 +0100 > > Subject: Re: [gmx-users] MPI ove

RE: [gmx-users] MPI oversubscription

2013-02-05 Thread Berk Hess
thout OpenMP. Did you disable that manually? Also large file support is not turned on. It seems like your build setup is somehow messed up and lot of features are not found. Cheers, Berk > Date: Tue, 5 Feb 2013 14:52:17 +0100 > Subject: Re: [gmx

Re: [gmx-users] MPI oversubscription

2013-02-05 Thread Christian H.
-------- > > Date: Tue, 5 Feb 2013 14:27:44 +0100 > > Subject: Re: [gmx-users] MPI oversubscription > > From: hypo...@googlemail.com > > To: gmx-users@gromacs.org > > > > None of the variables referenced here are set on my system, the print > > stateme

RE: [gmx-users] MPI oversubscription

2013-02-05 Thread Berk Hess
immediately after  #ifdef GMX_OMPENMP:     if (ret <= 0)     {     ret = gmx_omp_get_num_procs();     } Cheers, Berk > Date: Tue, 5 Feb 2013 14:27:44 +0100 > Subject: Re: [gmx-users] MPI oversubscription > From: hypo...@googlemail.com >

Re: [gmx-users] MPI oversubscription

2013-02-05 Thread Christian H.
e of the cases > is called. > > Cheers, > > Berk > > > > > Date: Tue, 5 Feb 2013 13:45:02 +0100 > > Subject: Re: [gmx-users] MPI oversubscription > > From: hypo...@googlemail.com > > To: gmx-users@gromacs.org

RE: [gmx-users] MPI oversubscription

2013-02-05 Thread Berk Hess
Feb 2013 13:45:02 +0100 > Subject: Re: [gmx-users] MPI oversubscription > From: hypo...@googlemail.com > To: gmx-users@gromacs.org > > >From the .log file: > > Present hardware specification: > Vendor: GenuineIntel > Brand: Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz > Family

Re: [gmx-users] MPI oversubscription

2013-02-05 Thread Christian H.
c acceleration."? > > Cheers, > > Berk > > > > > Date: Tue, 5 Feb 2013 11:38:53 +0100 > > From: hypo...@googlemail.com > > To: gmx-users@gromacs.org > > Subject: [gmx-users] MPI oversubscription > > > > Hi, > > > > I am using the

RE: [gmx-users] MPI oversubscription

2013-02-05 Thread Berk Hess
> Date: Tue, 5 Feb 2013 11:38:53 +0100 > From: hypo...@googlemail.com > To: gmx-users@gromacs.org > Subject: [gmx-users] MPI oversubscription > > Hi, > > I am using the latest git version of gromacs, compiled with gcc 4.6.2 and > openmpi 1.6.3. > I start the progra

[gmx-users] MPI oversubscription

2013-02-05 Thread Christian H.
Hi, I am using the latest git version of gromacs, compiled with gcc 4.6.2 and openmpi 1.6.3. I start the program using the usual mpirun -np 8 mdrun_mpi ... This always leads to a warning: Using 1 MPI process WARNING: On node 0: oversubscribing the available 0 logical CPU cores per node with 1 MPI
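
The workaround Berk Hess posts in this thread (quoted above), reassembled here from the fragments; "GMX_OMPENMP" in the quotes is presumably the GMX_OPENMP macro, and the enclosing CPU-detection function is not shown in the previews:

    #ifdef GMX_OPENMP
        if (ret <= 0)
        {
            /* fall back to the OpenMP runtime when the OS reports no cores */
            ret = gmx_omp_get_num_procs();
        }
    #endif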

Re: [gmx-users] MPI segmentation Fault

2013-01-02 Thread Xu Dong Huang
Dear Justin, I will try your suggestion to rerun it with Berendsen coupling and see if there are any further problems. The system indeed contains the star-shaped structure I showed you earlier (running multiple versions, containing 1, 5, 10, 15 and 20 stars in a solvent box); the solvent is ju

Re: [gmx-users] MPI segmentation Fault

2013-01-02 Thread Justin Lemkul
On 1/2/13 3:42 PM, Xu Dong Huang wrote: @Justin, here are the details of my NVT and NPT .mdp, (All ran without position restraint) , + details of the NVT run For NVT.mdp: integrator = md tinit= 0.0 dt = 0.02 nsteps = 10

Re: [gmx-users] MPI segmentation Fault

2013-01-02 Thread Xu Dong Huang
@Justin, forgot to mention, yes the NVT was run using MPI as well. (You're right, the MPI is not the issue, because I took the file off my cluster and attempted to run NPT on my personal computer; it reported the same segmentation error, fault 11) Xu Dong Huang Chemical & Biochemical Engineering Ru

Re: [gmx-users] MPI segmentation Fault

2013-01-02 Thread Xu Dong Huang
@Justin, here are the details of my NVT and NPT .mdp, (All ran without position restraint) , + details of the NVT run For NVT.mdp: integrator = md tinit= 0.0 dt = 0.02 nsteps = 1 nstcomm = 1 comm-grps

Re: [gmx-users] MPI segmentation Fault

2013-01-02 Thread Justin Lemkul
On 1/2/13 3:20 PM, Xu Dong Huang wrote: Dear gromacs users, after examining my nvt run outcome, I see that my molecule has half of the arm stuck outside of the box (I guess my run wasn't very successful). But I thought the jumping outside of the box would be corrected by periodic boundary? Sh

Re: [gmx-users] MPI segmentation Fault

2013-01-02 Thread Xu Dong Huang
Dear gromacs users, after examining my nvt run outcome, I see that my molecule has half of the arm stuck outside of the box (I guess my run wasn't very successful). But I thought the jumping outside of the box would be corrected by periodic boundary? Should I run my nvt for longer since the stru

[gmx-users] MPI segmentation Fault

2013-01-02 Thread Xu Dong Huang
Dear gromacs users, after running a successful NVT, I was going to run NPT. (All ran with MPI), however when I attempt to run the NPT, I get the following segmentation error: Getting Loaded... Reading file npt.tpr, VERSION 4.5.3 (single precision) Loaded with Money WARNING: This run will gener
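
A minimal sketch of the Berendsen pressure-coupling settings suggested above in this thread; the values are water-like placeholders, not taken from the thread:

    pcoupl           = berendsen
    pcoupltype       = isotropic
    tau_p            = 1.0
    ref_p            = 1.0
    compressibility  = 4.5e-5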

Re: [gmx-users] mpi enabled gromacs

2012-10-23 Thread Rajiv Gandhi
I have recompiled with -fPIC and it compiled well. I have not used an mpi cluster before, so I have a few doubts about it. Could you please tell me how I can check whether my gromacs was installed with enable_mpi? How do I check that the cluster nodes are connected in gromacs? Is it the same procedure for running simulat
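
Two quick checks, assuming a dynamically linked binary (log-file wording varies between versions, so treat the second as approximate; file names hypothetical):

    ldd $(which mdrun_mpi) | grep -i mpi   # an MPI-enabled build links against the MPI library
    mpirun -np 4 mdrun_mpi -v -deffnm md   # the .log header should then report several MPI processes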

Re: [gmx-users] mpi enabled gromacs

2012-10-22 Thread TH Chew
Hi, I think you need to recompile your mpi library. See the "recompile with -fPIC"? You need to recompile your mpi library with that option. At least that is what I did last time when I installed GROMACS. On Oct 22, 2012 5:04 PM, "Rajiv Gandhi" wrote: > Dear Gromacs users, > > I have been trying to

[gmx-users] MPI simulation with CHARMM27 force field and CMAP dihedrals problem

2012-10-02 Thread Koivuniemi, Artturi
Hi, I have tried to simulate a protein in water with the CHARMM27 force field and the GROMACS simulation package. Without the CMAP correction the simulation runs just fine, but when adding the CMAP correction to the dihedrals, the protein quickly starts to unfold and the simulation stops (with

Re: [gmx-users] MPI installation

2012-04-08 Thread Mark Abraham
On 9/04/2012 1:32 PM, TH Chew wrote: Hi, I had similar problem. Still did not manage to compile GROMACS with MPICH2 but OpenMPI works fine. You might want to try that instead. People report problems with MPICH. People rarely report version numbers that work or fail. Nobody's reported an Open

Re: [gmx-users] MPI installation

2012-04-08 Thread TH Chew
Hi, I had similar problem. Still did not manage to compile GROMACS with MPICH2 but OpenMPI works fine. You might want to try that instead. On Mon, Apr 9, 2012 at 11:11 AM, Mark Abraham wrote: > On 8/04/2012 12:41 PM, bharat gupta wrote: > > Hi, > > I am trying to enable mpi fro mdrun in an alre

Re: [gmx-users] MPI installation

2012-04-08 Thread Mark Abraham
On 8/04/2012 12:41 PM, bharat gupta wrote: Hi, I am trying to enable mpi for mdrun in an already installed gromacs-4.5.5. Using a freshly unpacked tarball will eliminate some sources of problems. Shouldn't be necessary, of course, but since nobody can warrant something hasn't been broken acc

[gmx-users] MPI installation

2012-04-07 Thread bharat gupta
Hi, I am trying to enable mpi for mdrun in an already installed gromacs-4.5.5. But while executing the command make mdrun, I am getting the following error:- mv -f .deps/xlate.Tpo .deps/xlate.Plo /bin/sh ../../libtool --tag=CC --mode=link mpicc -O3 -fomit-frame-pointer -finline-functions -Wal
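
Per Mark's suggestion above, a clean rebuild from a fresh tarball sidesteps stale libtool state. A sketch of the MPI pass, with the version and suffix conventions used elsewhere on this list:

    tar xzf gromacs-4.5.5.tar.gz && cd gromacs-4.5.5
    ./configure --enable-mpi --program-suffix=_mpi
    make mdrun && make install-mdrun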

Re: [gmx-users] MPI Scaling Issues

2012-02-03 Thread Mark Abraham
On 4/02/2012 5:20 AM, Christoph Klein wrote: Hi all, I am running a water/surfactant system with just under 10 atoms using MPI on a local cluster and not getting the scaling I was hoping for. The cluster consists of 8 core xeon nodes and I'm running gromacs 4.5 with mpich2-gnu. I've tried

[gmx-users] MPI Scaling Issues

2012-02-03 Thread Christoph Klein
Hi all, I am running a water/surfactant system with just under 10 atoms using MPI on a local cluster and not getting the scaling I was hoping for. The cluster consists of 8 core xeon nodes and I'm running gromacs 4.5 with mpich2-gnu. I've tried running a few benchmarks using 100ps runs and get

Re: [gmx-users] mpi run in Gromacs

2011-03-02 Thread Mark Abraham
On 2/03/2011 11:37 PM, Erik Marklund wrote: Selina Nawaz skrev 2011-03-02 11.54: Hi, I am a PhD student studying polymer membranes. I have installed Gromacs version 4.5.2 on our department cluster and attempted to run simulations on a DPPC bilayer membrane taken from the tielleman website.

Re: [gmx-users] mpi run in Gromacs

2011-03-02 Thread Erik Marklund
Selina Nawaz skrev 2011-03-02 11.54: Hi, I am a PhD student studying polymer membranes. I have installed Gromacs version 4.5.2 on our department cluster and attempted to run simulations on a DPPC bilayer membrane taken from the tielleman website. I have set the system up in an NPT ensemble

[gmx-users] mpi run in Gromacs

2011-03-02 Thread Selina Nawaz
Hi, I am a PhD student studying polymer membranes. I have installed Gromacs version 4.5.2 on our department cluster and attempted to run simulations on a DPPC bilayer membrane taken from the tielleman website. I have set the system up in an NPT ensemble using a semi-isotropic pressure coupli
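
A minimal sketch of a semi-isotropic pressure-coupling block of the kind described here; the values are typical bilayer placeholders, not taken from the thread:

    pcoupl           = Parrinello-Rahman
    pcoupltype       = semiisotropic
    tau_p            = 5.0
    ref_p            = 1.0 1.0
    compressibility  = 4.5e-5 4.5e-5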

RE: [gmx-users] mpi run in Gromacs 4.5.3

2010-12-13 Thread Te, Jerez A., Ph.D.
users Subject: Re: [gmx-users] mpi run in Gromacs 4.5.3 Te, Jerez A., Ph.D. wrote: > Hi Mark, > > Thank you for your reply. Just to confirm, mdrun_mpi is still being used in > Gromacs 4.5.3? The gromacs manual (Appendix A.5- Running Gromacs in parallel) > suggested using "mdrun -

Re: [gmx-users] mpi run in Gromacs 4.5.3

2010-12-13 Thread Mark Abraham
one, --enable-mpi. Obviously you need a correctly-configured MPI compiler and environment for it to work Mark Thanks, JT -Original Message- From: gmx-users-boun...@gromacs.org on behalf of Mark Abraham Sent: Mon 12/13/2010 4:38 PM To: Discussion list for GROMACS users Subject: Re: [gmx-users]

Re: [gmx-users] mpi run in Gromacs 4.5.3

2010-12-13 Thread Justin A. Lemkul
ing support, the -nt option will be printed if you issue mdrun -h. -Justin Thanks, JT -Original Message- From: gmx-users-boun...@gromacs.org on behalf of Mark Abraham Sent: Mon 12/13/2010 4:38 PM To: Discussion list for GROMACS users Subject: Re: [gmx-users] mpi run in Gromacs 4.5.3 O

RE: [gmx-users] mpi run in Gromacs 4.5.3

2010-12-13 Thread Te, Jerez A., Ph.D.
ROMACS users Subject: Re: [gmx-users] mpi run in Gromacs 4.5.3 On 14/12/2010 7:48 AM, Te, Jerez A., Ph.D. wrote: > > Hi, > I have been trying to run Gromacs 4.5.3 parallel simulations using > openmpi 1.4.2. From my understanding, mdrun_mpi is not used in this > version of Gromacs.

Re: [gmx-users] mpi run in Gromacs 4.5.3

2010-12-13 Thread Mark Abraham
On 14/12/2010 7:48 AM, Te, Jerez A., Ph.D. wrote: Hi, I have been trying to run Gromacs 4.5.3 parallel simulations using openmpi 1.4.2. From my understanding, mdrun_mpi is not used in this version of Gromacs. I don't understand what (you think) you mean. You can use thread-based paralleli

[gmx-users] mpi run in Gromacs 4.5.3

2010-12-13 Thread Te, Jerez A., Ph.D.
Hi, I have been trying to run Gromacs 4.5.3 parallel simulations using openmpi 1.4.2. From my understanding, mdrun_mpi is not used in this version of Gromacs. Our system administrator told me that all mpi related options have been turned on while installing Gromacs. With either commands: mdrun -
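
For reference, the two parallel modes available in 4.5, sketched with hypothetical file names: the built-in thread-MPI binary needs no mpirun, while mdrun_mpi exists only if the build was configured with MPI support:

    mdrun -nt 8 -deffnm md              # built-in thread-MPI, single node
    mpirun -np 8 mdrun_mpi -deffnm md   # external MPI, e.g. openmpi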

RE: [gmx-users] MPI and dual-core laptop

2010-09-28 Thread Berk Hess
Hi, You don't need MPI within one machine. Gromacs 4.5 has a built in thread-mpi library that gets built automatically. Berk > Date: Tue, 28 Sep 2010 10:22:30 +0200 > From: domm...@icp.uni-stuttgart.de > To: gmx-users@gromacs.org > Subject: Re: [gmx-users] MPI and

Re: [gmx-users] MPI and dual-core laptop

2010-09-28 Thread Florian Dommert
On 09/27/2010 08:42 PM, simon sham wrote: > Hi, > I wanted to test the GROMACS MPI version in my dual-processors laptop. I > have installed openmpi 1.4.2 version. However, when I tried to configure > GROMACS 4.5.1 with --enable-mpi option, I got the fo

Re: [gmx-users] MPI and dual-core laptop

2010-09-28 Thread Carsten Kutzner
Hi, if you only want to use the two processors of your laptop you can simply leave out the --enable-mpi flag. Then it will work in parallel using threads. Use mdrun -nt 2 -s ... to specify two threads. If you anyhow want to compile with MPI, take a look at the config.log file (search for 'Cannot

Re: [gmx-users] MPI and dual-core laptop

2010-09-27 Thread Gonçalo C . Justino
Hi, My 2 cents on your problem: I've been running gromacs on nearly everything, from one to six cores. It works. Besides mpi being in your /usr/local/bin, is it on the path of your system? You can do, in the terminal export PATH=$PATH:/usr/local/bin Or, for a more permanent solution, go to your

[gmx-users] MPI and dual-core laptop

2010-09-27 Thread simon sham
Hi, I wanted to test the GROMACS MPI version in my dual-processors laptop. I have installed openmpi 1.4.2 version. However, when I tried to configure GROMACS 4.5.1 with --enable-mpi option, I got the following configuration problem: "checking whether the MPI cc command works... configure: error:

Re: [gmx-users] MPI

2010-09-25 Thread Justin A. Lemkul
Florian Dommert wrote: On 09/24/2010 09:18 PM, Justin A. Lemkul wrote: Guess my reply never hit the list, either, but it's in the archive: http://lists.gromacs.org/pipermail/gmx-users/2010-September/054259.html -Justin Besides Justins answer p

Re: [gmx-users] MPI

2010-09-25 Thread Florian Dommert
On 09/24/2010 09:18 PM, Justin A. Lemkul wrote: > > Guess my reply never hit the list, either, but it's in the archive: > > http://lists.gromacs.org/pipermail/gmx-users/2010-September/054259.html > > -Justin Besides Justins answer perhaps you shoul

Re: [gmx-users] MPI

2010-09-24 Thread Justin A. Lemkul
Guess my reply never hit the list, either, but it's in the archive: http://lists.gromacs.org/pipermail/gmx-users/2010-September/054259.html -Justin simon sham wrote: Hi, This is the same email that I'd sent yesterday but has not got posted? Hi, I have some questions about installation of GROM

[gmx-users] MPI

2010-09-24 Thread simon sham
Hi, This is the same email that I'd sent yesterday but has not got posted? Hi, I have some questions about installation of GROMACS with MPI. Our system administrator has installed the 4.5.1 version in our system. When I tested the software with "openmpi/openmpi-1.3.3-gcc/bin/mpirun" with mdrun_mpi,

Re: [gmx-users] mpi installation problems

2010-07-29 Thread Carsten Kutzner
Hi Ryan, you need to use mpiicc instead of mpicc (should be present also in /opt/intel/mpi/3.1/bin64/). Carsten On Jul 28, 2010, at 11:32 PM, Ryan S Davis (rsdavis1) wrote: > I am trying to install gromacs with MPI enabled on a cluster but it seems > like fftw is giving me trouble. > > Fir

[gmx-users] mpi installation problems

2010-07-28 Thread Ryan S Davis (rsdavis1)
I am trying to install gromacs with MPI enabled on a cluster but it seems like fftw is giving me trouble. First, I compile everything without MPI just fine. FFTW is already on the cluster. I export pertinent variables... export CPPFLAGS=-I/opt/fftw/3.2.1/include export LDFLAGS=-L/opt/fftw/3.2.
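
Combining Carsten's mpiicc advice above with the exports already quoted here, a sketch of the configure invocation (paths as given in the thread; CC and the _mpi suffix are common conventions, not mandated):

    export CC=mpiicc
    export CPPFLAGS=-I/opt/fftw/3.2.1/include
    export LDFLAGS=-L/opt/fftw/3.2.1/lib
    ./configure --enable-mpi --program-suffix=_mpi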

Re: RE: [gmx-users] mpi run

2010-07-08 Thread Mark Abraham
- Original Message - From: #ZHAO LINA# Date: Thursday, July 8, 2010 18:53 Subject: RE: [gmx-users] mpi run To: Discussion list for GROMACS users > During your installation, if

RE: [gmx-users] mpi run

2010-07-08 Thread #ZHAO LINA#
: Discussion list for GROMACS users Subject: Re: [gmx-users] mpi run Hi Mahmoud, for anyone to be able to help you, you need to provide a lot more information, at least: - which mpi library are you using? - how did you compile and/or install Gromacs? - what commands do you use to run mdrun and what was the

Re: [gmx-users] mpi run

2010-07-08 Thread Carsten Kutzner
Hi Mahmoud, for anyone to be able to help you, you need to provide a lot more information, at least: - which mpi library are you using? - how did you compile and/or install Gromacs? - what commands do you use to run mdrun and what was the output of it? Best, Carsten On Jul 8, 2010, at 9:41 AM

[gmx-users] mpi run

2010-07-08 Thread nanogroup
Dear GMX Users, I have a PC with 4 CPU, but Gromacs only uses one CPU. The command mpirun works on linux; however, the command mdrun_mpi does not work. Would you please help me how to set up the mdrun_mpi in Gromacs 4.0.4 Many thanks, Mahmoud
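
Assuming mdrun_mpi was actually built against a working MPI library (4.0.x has no built-in thread-MPI, so external MPI is the only multi-core route), the invocation would look like this, with hypothetical file names:

    mpirun -np 4 mdrun_mpi -v -deffnm md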

Re: [gmx-users] mpi-run2

2010-05-31 Thread Jussi Lehtola
On Mon, 2010-05-31 at 07:38 -0700, nanogroup wrote: > Dear Justin, > > Let me tell the details: > > I have a PC with 4 CPU, > > Fedora 11 x86_64 is installed, > > The rpm files of Gromacs 4 are installed, > > now, I want to configure the gromacs to use all 4 CPUs, > > At the end of configurat

[gmx-users] mpi-run2

2010-05-31 Thread nanogroup
Dear Justin, Let me tell the details: I have a PC with 4 CPU, Fedora 11 x86_64 is installed, The rpm files of Gromacs 4 are installed, now, I want to configure the gromacs to use all 4 CPUs, At the end of configuration process, it says that the FFTW could not be found! The fftw files are ins

Re: [gmx-users] mpi-run

2010-05-30 Thread Justin A. Lemkul
nanogroup wrote: Dear GMX Users, I want to run Gromacs on a multiprocessor PC. The MPI files are correctly installed and the gromacs is also configured. However, at the end of configuration section, an Error appears that the FFTW can not be found! Indeed, the FFTW is already installed but

[gmx-users] mpi-run

2010-05-30 Thread nanogroup
Dear GMX Users, I want to run Gromacs on a multiprocessor PC. The MPI files are correctly installed and the gromacs is also configured. However, at the end of configuration section, an Error appears that the FFTW can not be found! Indeed, the FFTW is already installed but the configuration can

Re: [gmx-users] MPI problem------Cannot compile and link MPI code with cc

2010-03-29 Thread Gavin Melaugh
Hi guys I have been running gromacs for a few months now with no real problems however some of my numbers differ from my supervisor's who uses DL-POLY. Anyway discussion of the discrepancies has led me to questioning aspects of my topology file below (excerpts). We created the topology ourselves. I

Re: [gmx-users] MPI problem——Cannot compile and link MPI code with cc

2010-03-29 Thread Mark Abraham
On 29/03/2010 10:24 PM, DreamCatcher wrote: Hello gmx-users, I am trying to install a MPI version of Gromacs that I come across some problems as follows: [cele...@celeste gromacs-4.0.7]$ sudo ./configure --prefix=/home/programmes/gromacs --disable-float --enable-mpi Don't use sudo for c

[gmx-users] MPI problem——Cannot compile and link MPI code with cc

2010-03-29 Thread DreamCatcher
Hello gmx-users, I am trying to install a MPI version of Gromacs that I come across some problems as follows: [cele...@celeste gromacs-4.0.7]$ sudo ./configure --prefix=/home/programmes/gromacs --disable-float --enable-mpi checking build system type... i686-pc-linux-gnu checking host system

Re: [gmx-users] MPI ERROR while installing GMX4.0.7

2010-01-15 Thread Chandan Choudhury
Dear Nuno thanks for your helpful suggestion. It worked successfully. But the following command mpirun -np 2 mdrun_mpi_d -v -s em_1.tpr -c em_1.pdb doesn't seem to run in parallel. I have mentioned the output below. ban...@corsica:~/CKC/L2PJR> mpirun -np 2 mdrun_mpi_d -v -s em_1.t
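
One quick sanity check, hedged since log wording differs between versions: the md.log of a genuinely parallel run reports the rank count (compare the "nnodes: 1" symptom in the mpi mdrun thread further down this page):

    grep -i nnodes md.log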

Re: [gmx-users] MPI ERROR while installing GMX4.0.7

2010-01-15 Thread Nuno Azoia
Hello! I had the same problem, and for me the solution was to set up the openmpi/lib directory. The compiler is not able to find it alone. I'm using bash, so for me the solution was: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/your/directory/openmpi/lib of course you have to change "/your/director

[gmx-users] MPI ERROR while installing GMX4.0.7

2010-01-14 Thread Chandan Choudhury
Hello gmx-users !! I am trying to install gromacs 4.0.7 double precision with mpi. I downloaded openmpi-1.4. and installed it. Then executed ./configure --enable-mpi --program-suffix=_mpi_d --prefix=/usr/local/gromacs407_double_mpi --enable-double It showed checking for mpicc... mpicc checking w

Re: [gmx-users] mpi code in the gromacs utilities

2009-12-29 Thread Justin A. Lemkul
Vitaly V. Chaban wrote: Hi, I wonder which of the gromacs programs except mdrun support parallel execution? Occasionally I tried g_energy and it seems not to support MPI, so it looks like each node performs the same job... That's because only mdrun is MPI-enabled. I believe

[gmx-users] mpi code in the gromacs utilities

2009-12-29 Thread Vitaly V. Chaban
Hi, I wonder which of the gromacs programs except mdrun support parallel execution? Occasionally I tried g_energy and it seems not to support MPI, so it looks like each node performs the same job... -- Vitaly V. Chaban, Ph.D. http://www-rmn.univer.kharkov.ua/chaban.html

Re: [gmx-users] mpi version of gromacs

2009-09-15 Thread Justin A. Lemkul
Amit Choubey wrote: ok here the pbs file for the job #!/bin/bash #PBS -l nodes=1:ppn=1 #PBS -l walltime=00:01:59 #PBS -o output.out #PBS -j oe #PBS -N mdrun WORK_HOME=/auto/hpc-08/knomura/choubey/GROMACS/test cd $WORK_HOME source /usr/usc/mpich/default/setup.sh mdrun_mpi -v -s em -

Re: [gmx-users] mpi version of gromacs

2009-09-15 Thread Amit Choubey
ok here the pbs file for the job #!/bin/bash #PBS -l nodes=1:ppn=1 #PBS -l walltime=00:01:59 #PBS -o output.out #PBS -j oe #PBS -N mdrun WORK_HOME=/auto/hpc-08/knomura/choubey/GROMACS/test cd $WORK_HOME source /usr/usc/mpich/default/setup.sh mdrun_mpi -v -s em -o em -c after_em -g emlog Is my comm

Re: [gmx-users] mpi version of gromacs

2009-09-15 Thread Justin A. Lemkul
Amit Choubey wrote: Hi everyone, I have the gromacs mpi version on my system and everything in the simulation works fine until i get to mdrun_mpi and when i use this executable the following error pops up MX:hpc-login2:mx_init:querying driver:error 5(errno=2):No MX device entry in /

[gmx-users] mpi version of gromacs

2009-09-15 Thread Amit Choubey
Hi everyone, I have the gromacs mpi version on my system and everything in the simulation works fine until i get to mdrun_mpi and when i use this executable the following error pops up MX:hpc-login2:mx_init:querying driver:error 5(errno=2):No MX device entry in /dev. Then i tried to do

Re: [gmx-users] MPI on Windows

2009-09-02 Thread Justin A. Lemkul
ednesday, September 02, 2009 3:47 PM Subject: Re: [gmx-users] MPI on Windows George Tsigaridas wrote: Hi all I would like to ask if it is possible to use MPI when GROMACS is installed on Windows using Cygwin. If yes, is there any special procedure that I should follow in this case or I just

Re: [gmx-users] MPI on Windows

2009-09-02 Thread George Tsigaridas
system? Thank you in advance George - Original Message - From: "Mark Abraham" To: "Discussion list for GROMACS users" Sent: Wednesday, September 02, 2009 3:47 PM Subject: Re: [gmx-users] MPI on Windows George Tsigaridas wrote: Hi all I would like to ask if it is

Re: [gmx-users] MPI on Windows

2009-09-02 Thread Mark Abraham
George Tsigaridas wrote: Hi all I would like to ask if it is possible to use MPI when GROMACS is installed on Windows using Cygwin. If yes, is there any special procedure that I should follow in this case or I just perform standard installation with MPI support? Also, which MPI driver should

[gmx-users] MPI on Windows

2009-09-02 Thread George Tsigaridas
Hi all I would like to ask if it is possible to use MPI when GROMACS is installed on Windows using Cygwin. If yes, is there any special procedure that I should follow in this case or I just perform standard installation with MPI support? Also, which MPI driver should I use? Is Open MPI suitable

Re: [gmx-users] mpi mdrun

2009-06-18 Thread Justin A. Lemkul
jayant james wrote: Hi! After installing GMX without mpi I give the following commands make clean ./configure --enable-mpi --disable-nice --program-suffix="_mpi" I am getting this problem when I give the --enable-mpi option checking size of int... configure: error: cannot compute sizeo

Re: [gmx-users] mpi mdrun

2009-06-18 Thread jayant james
Hi! After installing GMX without mpi I give the following commands make clean ./configure --enable-mpi --disable-nice --program-suffix="_mpi" I am getting this problem when I give the --enable-mpi option checking build system type... x86_64-unknown-linux-gnu checking host system type... x86_64

Re: [gmx-users] mpi mdrun

2009-06-18 Thread Mark Abraham
jayant james wrote: Hi! Thanks for your reply. So I suppose I need to have a hostfile in my directory and call it during the mpirun command. I have one clarification, since I am using a quad core how am I to list the processors in the host file? would I need to open a file named hostfile

Re: [gmx-users] mpi mdrun

2009-06-18 Thread Jussi Lehtola
On Thu, 2009-06-18 at 10:03 -0700, jayant james wrote: > Hi! > Thanks for your reply. So I suppose I need to have a hostfile in my > directory and call it during the mpirun command. I have one > clarification, since I am using a quad core how am I to list the > processors on the host file? would i

Re: [gmx-users] mpi mdrun

2009-06-18 Thread jayant james
Hi! Thanks for your reply. So I suppose I need to have a hostfile in my directory and call it during the mpirun command. I have one clarification, since I am using a quad core how am I to list the processors in the host file? would I need to open a file named hostfile and have a list, for exampl
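
A minimal hostfile sketch for a single quad-core machine, assuming Open MPI syntax (MPICH-style machinefiles use hostname:4 instead); the file name is arbitrary:

    # hostfile
    localhost slots=4

    mpirun -np 4 -hostfile hostfile mdrun_mpi -v -deffnm md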

Re: [gmx-users] mpi mdrun

2009-06-17 Thread Mark Abraham
jayant james wrote: Hi! Oh!! I see that nnodes: 1. So does that mean that the job I gave is not running on four processors? If so how am I to solve this problem? You haven't configured your MPI system with a suitable hostfile/whatever for your machine, probably. Mark

Re: [gmx-users] mpi mdrun

2009-06-17 Thread jayant james
Hi! Oh!! I see that nnodes: 1. So does that mean that the job I gave is not running on four processors? If so how am I to solve this problem? thanks JJ On Wed, Jun 17, 2009 at 9:10 PM, Mark Abraham wrote: > jayant james wrote: > >> Hi Mark! >> Thanks for the tip I got it the mpi mdrun running on

Re: [gmx-users] mpi mdrun

2009-06-17 Thread Mark Abraham
jayant james wrote: Hi Mark! Thanks for the tip I got it the mpi mdrun running on my quad core machine. I just have one small clarification. In the output file md.log I see this message "Started mdrun at node (0)" I monitor my processor's load using gkrellm to see how many are running. Whe

Re: [gmx-users] mpi mdrun

2009-06-17 Thread jayant james
Hi Mark! Thanks for the tip I got it the mpi mdrun running on my quad core machine. I just have one small clarification. In the output file md.log I see this message "Started mdrun at node (0)" I monitor my processor's load using gkrellm to see how many are running. When I started the mdrun ( mp

Re: [gmx-users] mpi mdrun

2009-06-16 Thread Mark Abraham
jayant james wrote: Hi !! I am attempting to install mpi mdrun such that I can use all four processors of my quad core system. But I keep running into this problem!! My operating system is Suse 10.1. (cd .libs && rm -f libgmxpreprocess_mpi.la && ln -s ../li

[gmx-users] mpi mdrun

2009-06-16 Thread jayant james
Hi !! I am attempting to install mpi mdrun such that I can use all four processors of my quad core system. But I keep running into this problem!! My operating system is Suse 10.1. (cd .libs && rm -f libgmxpreprocess_mpi.la && ln -s ../libgmxpreprocess_mpi.la libgmxpreprocess_mpi.la) make[1]: ***

Re: [gmx-users] MPI gromacs under cygwin

2009-06-16 Thread Mark Abraham
Dmitri Dubov wrote: Hi, all! Is there anyone who uses GMX in MPI mode under cygwin? I'm trying to install it. Based on the old gromacs site and wiki I installed cygwin, FFTW and LAM/MPI. Unfortunately I failed to install libaio and left it out. But when configuring gromacs with ./configure --ena

[gmx-users] MPI gromacs under cygwin

2009-06-16 Thread Dmitri Dubov
Hi, all! Is there anyone who uses GMX in MPI mode under cygwin? I'm trying to install it. Based on the old gromacs site and wiki I installed cygwin, FFTW and LAM/MPI. Unfortunately I failed to install libaio and left it out. But when configuring gromacs with ./configure --enable-mpi --disable-float --

Re: [gmx-users] mpi problem during installation

2009-05-27 Thread Mark Abraham
Itamar Kass wrote: HI all, I am trying to compile GROMACS 4.0.5 on my mac (10.5) using './configure --enable-mpi --disable-float --with-fft=no && make -j2 && make install'. I installed on the system lam 7.0.6 './configure && make && make install'. The error message I get is: mpicc -O3 -fomit-f

[gmx-users] mpi problem during installation

2009-05-27 Thread Itamar Kass
HI all, I am trying to compile GROMACS 4.0.5 on my mac (10.5) using './configure --enable-mpi --disable-float --with-fft=no && make -j2 && make install'. I installed on the system lam 7.0.6 './configure && make && make install'. The error message I get is: mpicc -O3 -fomit-frame-pointer -finline

[gmx-users] mpi problem

2009-05-06 Thread jagannath mondal
Hi, I am trying to use the parallel gromacs 3.3.3 mdrun programme. I have installed mpich2 on our quadcore machine. But I am having 3 problems: 1. The scaling is very poor among 4 cores: varies between 30-70%. 2. If I run the mdrun_mpi in background, the output log file cites the following error:
