The error message is clear: your spin multiplicity is 0, which is
impossible.
Please make sure you understand the basics of electronic structure
theory. To test this, you can run the QM system on its own in a
standalone QM package.
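
For example, a minimal standalone ORCA input at the same level of theory
could look like the sketch below (the water geometry is only a
placeholder, not your 22-atom QM region). The two numbers on the "* xyz"
line are the total charge and the spin multiplicity 2S+1, which must be
at least 1; in a GROMACS QM/MM run these values normally come from the
QMcharge and QMmult settings in the .mdp file.

! B3LYP 3-21G
# charge = 0, multiplicity = 2S+1 = 1 (closed shell); 0 is never valid
* xyz 0 1
O   0.000   0.000   0.000
H   0.000   0.000   0.960
H   0.930   0.000  -0.240
*

Run it with something like "orca water.inp > water.out" and check that it
finishes cleanly before returning to the QM/MM setup.
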
gerrit
-----------------
./configure --without-qmmm-orca --without-qmmm-gaussian --enable-mpi
make
make install
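
Since the log below reports "orca initialised...", the build presumably
did have the ORCA interface enabled; with the GROMACS 4.5 autoconf
options that would look roughly like:
./configure --with-qmmm-orca --without-qmmm-gaussian --enable-mpi
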
I installed GROMACS in parallel (MPI) mode, not with threading. When I
run "mpirun -np 1 mdrun_dd -v -s pyp.tpr &" or "mdrun_dd -nt 1 -v -s
pyp.tpr", it still fails:
Back Off! I just backed up md.log to ./#md.log.20#
Getting Loaded...
Reading file pyp.tpr, VERSION 4.5.1 (single precision)
Loaded with Money
QM/MM calculation requested.
there we go!
Layer 0
nr of QM atoms 22
QMlevel: B3LYP/3-21G
orca initialised...
Back Off! I just backed up traj.trr to ./#traj.trr.1#
Back Off! I just backed up traj.xtc to ./#traj.xtc.1#
Back Off! I just backed up ener.edr to ./#ener.edr.2#
starting mdrun 'PHOTOACTIVE YELLOW PROTEIN in water'
500 steps,     0.5 ps.
Calling 'orca pyp.inp>> pyp.out'
Error : multiplicity (Mult:=2*S+1) is zero
-------------------------------------------------------
Program mdrun_dd, VERSION 4.5.1
Source code file: qm_orca.c, line: 393
Fatal error:
Call to 'orca pyp.inp>> pyp.out' failed
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------
"The Carpenter Goes Bang Bang" (The Breeders)
Halting program mdrun_dd
gcq#129: "The Carpenter Goes Bang Bang" (The Breeders)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode -1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 18080 on
node localhost.localdomain exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--- On Mon, 14 Nov 2011, Christoph Riplinger <c...@thch.uni-bonn.de>
wrote:
From: Christoph Riplinger <c...@thch.uni-bonn.de>
Subject: Re: [gmx-users] orca and Segmentation fault
To: "Discussion list for GROMACS users" <gmx-users@gromacs.org>
Date: Mon, 14 Nov 2011, 6:51 PM