Agree with Pavel.
Something I think is worth adding is a reminder that the MolProbity score
only looks at bad clashes, Ramachandran outliers, and rotamer outliers:
MPscore = 0.426 * ln(1 + clashscore) + 0.33 * ln(1 + max(0, rota_out - 1)) + 0.25 * ln(1 + max(0, rama_iffy - 2)) + 0.5
It pays no attention whatsoever to twisted peptide bonds, C-beta
deviations, or, for that matter, bond lengths and bond angles. If you
tweak your restraint weights just right, you can get an excellent MP
score but horrible "geometry" in the traditional bonds-and-angles sense.
The logic behind this kind of validation is that non-bonded contacts and
torsions are normally restrained much more softly than bonds and angles,
and are therefore fertile ground for detecting problems. Thus far, I am
not aware of any "Grand Unified Score" that combines all geometric
considerations, but perhaps it is time for one?
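In case it helps to make the arithmetic concrete, here is a minimal Python sketch of the formula above, nothing more (my reading of the terms is that "rota_out" is the percentage of rotamer outliers and "rama_iffy" the percentage of residues outside the Ramachandran favored region):

import math

def molprobity_score(clashscore, rota_out, rama_iffy):
    """MolProbity score, re-typed from the formula above.

    clashscore : serious all-atom clashes per 1000 atoms
    rota_out   : % rotamer outliers
    rama_iffy  : % residues outside the Ramachandran favored region
    """
    return (0.426 * math.log(1 + clashscore)
            + 0.33 * math.log(1 + max(0, rota_out - 1))
            + 0.25 * math.log(1 + max(0, rama_iffy - 2))
            + 0.5)

# e.g. clashscore 4, 1.5% rotamer outliers, 3% non-favored Ramachandran
print(round(molprobity_score(4.0, 1.5, 3.0), 2))

Note that bond and angle RMSDs never enter the formula, which is exactly the point above.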
Tristan's trivial solution aside, it is actually very hard to make all
the "geometry" ideal for a real-world fold, and especially difficult to
do without also screwing up the agreement with density (R factor). I
would argue that if you don't have an R factor then you should get one,
but I am interested in opinions about alternatives.
That is: what if we could train an AI to predict Rfree by looking at the
coordinates?
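For what it's worth, here is a toy Python sketch of the kind of thing I have in mind; the feature set, the random stand-in data, and the choice of regressor are all placeholders rather than a proposal for how such a predictor should actually be built (it assumes scikit-learn is available):

# Toy illustration only: random numbers stand in for real geometry
# features and deposited Rfree values.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Pretend each row describes one deposited structure by a handful of
# coordinate-derived features: clashscore, % rotamer outliers,
# % Ramachandran outliers, bond RMSD, angle RMSD, resolution.
X = rng.random((500, 6))
# Fake "deposited Rfree" values so the example runs end to end.
y = 0.20 + 0.10 * X[:, 0] + 0.02 * rng.standard_normal(500)

model = GradientBoostingRegressor().fit(X, y)
print("predicted Rfree for the first model:", model.predict(X[:1])[0])

Whether something along these lines would tell us anything that existing geometry scores do not is exactly the kind of alternative I would like opinions on.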
-James Holton
MAD Scientist
On 12/21/2021 9:25 AM, Pavel Afonine wrote:
Hi Reza,
If you think about it this way: validation is making sure that the
model makes sense, the data make sense, and the model-to-data fit makes
sense. Then the answer to your question is obvious: in your case you do
not have experimental data (at least not in the way we are used to
thinking of it), so of these three validation items you only have one.
That means, for example, that you don't have to report things like
R-factors or completeness in the high-resolution shell.
Really, the geometry of an alpha helix does not depend on how you
determined it, whether by X-rays, cryo-EM, or something else! So most
(if not all) model validation tools still apply.
Pavel
On Mon, Dec 20, 2021 at 8:10 AM Reza Khayat <rkha...@ccny.cuny.edu> wrote:
Hi,
Can anyone suggest how to validate a predicted structure?
Something similar to wwPDB validation without the need for
refinement statistics. I realize this is a strange question given
that the geometry of the model is anticipated to be fine if the
structure was predicted by a server that minimizes the geometry to
improve its statistics. Nonetheless, the journal has asked me for
such a report. Thanks.
Best wishes,
Reza
Reza Khayat, PhD
Associate Professor
City College of New York
Department of Chemistry and Biochemistry
New York, NY 10031