Re: [gmx-users] limit of number of processors ??

2006-12-21 Thread David van der Spoel
이 선주 wrote: > Hello All, > I have been doing membrane simulations using the computer nodes at the > UTEXAS supercomputing center. Each board contains two 64-bit dual-core > Intel Xeon processors (4 cores in all) and the nodes are interconnected > with InfiniBand. The performance is great

[gmx-users] limit of number of processors ??

2006-12-21 Thread 이 선주
Hello All, I have been doing membrane simulations using the computer nodes at the UTEXAS supercomputing center. Each board contains two 64-bit dual-core Intel Xeon processors (4 cores in all) and the nodes are interconnected with InfiniBand. The performance is great. However, I have got w
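For context, a multi-node GROMACS run of that era (GROMACS 3.x over LAM/MPI, as used elsewhere in this thread) was typically launched roughly as sketched below. The hostnames, file names, and process counts are placeholders, and the exact flags depend on the MPI implementation installed on the cluster.

```shell
# Hypothetical LAM machine file: one line per node, 4 CPUs each.
cat > hosts.lam <<'EOF'
node001 cpu=4
node002 cpu=4
EOF

# Boot the LAM/MPI daemons on the listed nodes.
lamboot hosts.lam

# Run an MPI-enabled mdrun on 8 processes (2 nodes x 4 cores).
# In GROMACS 3.x, mdrun needed its own -np matching mpirun's -np.
mpirun -np 8 mdrun_mpi -np 8 -s topol.tpr -deffnm md
```

Whether performance keeps scaling beyond a few nodes depends on system size and the interconnect; InfiniBand helps, but the 3.x particle-decomposition scheme limited scaling for small systems.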

Re: [gmx-users] do_dssp

2006-12-21 Thread David van der Spoel
Tsjerk Wassenaar wrote: Hi Gurpreet, The common message is "don't worry, mdrun doesn't write broken molecules". So your secondary structure will not be affected by periodic boundary conditions. Cheers, Tsjerk On 12/21/06, singh <[EMAIL PROTECTED]> wrote: But there could be a problem for b

Re: [gmx-users] do_dssp

2006-12-21 Thread Tsjerk Wassenaar
Hi Gurpreet, The common message is "don't worry, mdrun doesn't write broken molecules". So your secondary structure will not be affected by periodic boundary conditions. Cheers, Tsjerk On 12/21/06, singh <[EMAIL PROTECTED]> wrote: Dear Gromacs users, I have simulated 12 peptide f

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Martin Höfling
On Thursday, 21 December 2006 08:41, Seaclear Theory wrote: Hi Ocean, > When I run mdrun, I got a message that lamboot is not started. Then I start > lamboot and try to run mdrun with "-np 4 -v N 4" (gromacs user guide try mpirun -np 4 mdrun -np 4 -bla -blubb This should be necessary on mos
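On a single multi-core machine the same LAM/MPI pattern applies, just without a hostfile; a minimal sketch, assuming an MPI-enabled mdrun built as mdrun_mpi and a run input named topol.tpr (both placeholders):

```shell
# Boot LAM on the local machine only (no machine file needed).
lamboot

# Launch 4 MPI processes on the 4 local cores; GROMACS 3.x
# required the matching -np on mdrun itself.
mpirun -np 4 mdrun_mpi -np 4 -s topol.tpr -v

# Shut down the LAM daemons when the run is finished.
lamhalt
```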

[gmx-users] do_dssp

2006-12-21 Thread singh
Dear Gromacs users, I have simulated 12 peptide fragments in a cubic box and I want to use do_dssp for secondary structure analysis, but I am not sure whether periodic boundary conditions will be taken into account during assignment (particularly for assigning beta-sheets). Regards, Gurpreet
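If broken molecules are a concern despite the reassurance above, the usual precaution is to make molecules whole with trjconv before running the analysis. A sketch with placeholder file names, using GROMACS 3.x-style command names:

```shell
# Rebuild any molecules split across the periodic box edge.
trjconv -s topol.tpr -f traj.xtc -o whole.xtc -pbc whole

# DSSP-based secondary-structure assignment on the fixed trajectory.
# do_dssp calls the external dssp program (pointed to by the DSSP
# environment variable) and writes an .xpm matrix plus counts.
do_dssp -s topol.tpr -f whole.xtc -o ss.xpm -sc scount.xvg
```

Note that DSSP assigns beta-sheets from hydrogen-bond geometry between actual atom coordinates, so inter-peptide sheets are only found if the partner chains are close in the unwrapped frame.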

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Mark Abraham
Seaclear Theory wrote: I do not understand why you are so upset. Are you the author of the user manual :-) Anyway, let's get back to the issue, not the person. I'm not upset - I'm just communicating directly :-) I do get a little irritated when someone wanting help doesn't describe their probl

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Seaclear Theory
Will work on it. Thanks. Ocean On 12/21/06, David van der Spoel <[EMAIL PROTECTED]> wrote: Seaclear Theory wrote: > I believed the case of a single machine with multiple processors (like a > Dell server with 2 dual-core CPUs) should be very popular for end users, > because not everyone can afford a

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread David van der Spoel
Seaclear Theory wrote: I believed the case of a single machine with multiple processors (like a Dell server with 2 dual-core CPUs) should be very popular for end users, because not everyone can afford a 32-node cluster or has access to a supercomputer center. Could we have a detailed tutorial for "run

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Seaclear Theory
I believed the case of a single machine with multiple processors (like a Dell server with 2 dual-core CPUs) should be very popular for end users, because not everyone can afford a 32-node cluster or has access to a supercomputer center. Could we have a detailed tutorial for "run gromacs on a single machine

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Seaclear Theory
Thanks very much for your help. Best, Ocean On 12/21/06, Yang Ye <[EMAIL PROTECTED]> wrote: Normally we use the following commands when compiling gromacs: ./configure ... --program-suffix=_mpi --enable-mpi ... So we will have mdrun_mpi in the end. The usage of mdrun_mpi is the same as mdrun, but

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread David van der Spoel
Yang Ye wrote: Normally we use the following commands when compiling gromacs: ./configure ... --program-suffix=_mpi --enable-mpi ... So we will have mdrun_mpi in the end. The usage of mdrun_mpi is the same as mdrun, but -np, -replex and other switches are activated. That statement in the user manu

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Yang Ye
Normally we use the following commands when compiling gromacs: ./configure ... --program-suffix=_mpi --enable-mpi ... So we will have mdrun_mpi in the end. The usage of mdrun_mpi is the same as mdrun, but -np, -replex and other switches are activated. That statement in the user manual is to clarify
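The build recipe above can be spelled out roughly as follows. This is a sketch for the GROMACS 3.x autoconf build; the install prefix is a placeholder, and the MPI compiler wrappers (mpicc etc.) must already be on the PATH.

```shell
# Configure a parallel build; the suffix keeps the MPI mdrun
# separate from an existing serial installation.
./configure --prefix=/opt/gromacs --enable-mpi --program-suffix=_mpi

# GROMACS 3.x provided dedicated targets to build and install
# only mdrun, since the analysis tools do not need MPI.
make mdrun
make install-mdrun
```

The result is an mdrun_mpi binary that is launched through mpirun, while grompp, trjconv and the other tools remain serial.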

Re: [gmx-users] run gromacs on a single machine with multiple CPUs

2006-12-21 Thread Seaclear Theory
On 12/20/06, Mark Abraham <[EMAIL PROTECTED]> wrote: Seaclear Theory wrote: On 12/20/06, Mark Abraham <[EMAIL PROTECTED]> wrote: Seaclear Theory wrote: Hi! All, I have a linux server with 4 CPUs. How can I run gromacs on all