Hi Frank,
This is off the original topic, but it seems important to clarify. If
I have muddled the concepts, I apologize.
Outer shell Rmerge will always be very high:
----------
True! Especially when I/Sig ~ 1 or less.
Only I/sigI (and completeness, although it's related) is really
relevant for deciding high resolution cutoff.
---------
Normally I use I/sigI = 2.0 as the resolution cutoff. At this
"accuracy" (please do not ask me for the exact meaning of sig; too many
factors contribute to it, including hardware, software, protocol,
strategy, ...), the average fractional measuring error of the
reflections can be expected to be the inverse of that number, 1/2.0,
i.e. 50%. This in general suggests that Rmerge should not exceed this
value by much if including the data is to be meaningful. (Please read
this carefully, since I do not want to confuse two different concepts.)
Otherwise you are merging data whose merging error is much larger than
the measuring error of the data. Although estimating sig(I) is
difficult, and sig(I) itself may carry a large error, when I/sigI ~ 3
an Rmerge of 70% still seems too high to accept.
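
To make that comparison concrete, here is a minimal sketch in plain
Python (the column layout of the observations is my own assumption, not
taken from any particular processing program): it computes Rmerge over
symmetry-equivalent observations alongside the rough expected
fractional error 1/<I/sigI> discussed above.

from collections import defaultdict

def rmerge_vs_expected_error(observations):
    """observations: list of ((h, k, l), I, sigI) tuples, with indices
    already mapped to the asymmetric unit; intensities assumed > 0."""
    groups = defaultdict(list)
    for hkl, i_obs, sig in observations:
        groups[hkl].append((i_obs, sig))

    num = den = 0.0          # Rmerge numerator and denominator
    isigi_sum = n_obs = 0.0  # accumulators for the mean I/sigI
    for obs in groups.values():
        mean_i = sum(i for i, _ in obs) / len(obs)
        for i_obs, sig in obs:
            num += abs(i_obs - mean_i)   # sum |I_i - <I>|
            den += i_obs                 # sum I_i
            isigi_sum += i_obs / sig
            n_obs += 1

    rmerge = num / den
    expected_error = 1.0 / (isigi_sum / n_obs)  # ~ 1/<I/sigI>
    return rmerge, expected_error

The point is simply that if the first number comes out much larger than
the second, the merging error dominates the measuring error.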
Rmerge is well known to be a weak indicator, but it is not just a
mathematical issue, and it is certainly not crap. It should be used
together with other indicators (I/SigI, redundancy, ...). I agree with
Ian that all data should be included, provided the quality is assured.
I have not combed through the history of refinement software and its
philosophy, but today it seems that all the prevailing refinement
packages use resolution bins for shelling (I know there is sufficient
theoretical ground to do so), and this is the source of the RESOLUTION
CUTOFF and of some problems arising from it, for example the Rmerge
issue. I would appreciate being told whether any software has ever used
I, I/SigI, F, F/SigF, or something else for binning, especially in the
early days of refinement package development. RESOLUTION BINNING might
not be a must? :D
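
Just to make the question concrete, here is a toy sketch (my own
illustration, not taken from any real package) of the usual resolution
shelling next to the kind of alternative I am asking about, binning by
I/SigI instead of by d. The orthorhombic d-spacing formula and the
equal-population shells are simplifying assumptions.

import math

def d_spacing(h, k, l, a, b, c):
    # d for an orthorhombic cell; a general cell needs the full
    # reciprocal metric tensor.
    return 1.0 / math.sqrt((h / a) ** 2 + (k / b) ** 2 + (l / c) ** 2)

def resolution_bins(d_values, n_bins):
    # Equal-population shells ordered by 1/d^3, so each shell holds
    # about the same number of reflections.
    s3 = sorted(1.0 / d ** 3 for d in d_values)
    edges = [s3[i * len(s3) // n_bins] for i in range(1, n_bins)]
    return [sum(e <= 1.0 / d ** 3 for e in edges) for d in d_values]

def isigi_bins(i_over_sig, n_bins):
    # Hypothetical alternative: rank reflections by I/sigI and cut the
    # ranking into n_bins groups.
    order = sorted(range(len(i_over_sig)), key=lambda j: i_over_sig[j])
    bins = [0] * len(i_over_sig)
    for rank, j in enumerate(order):
        bins[j] = rank * n_bins // len(i_over_sig)
    return bins

Whether statistics reported over the second kind of bins would be any
more (or less) informative than the familiar resolution shells is
exactly what I am curious about.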
Best regards.
Lijun Liu, PhD
http://www.uoregon.edu/~liulj/