Dear Jacob,
As an editor I am always mindful that an article finally appears under the
authors' names. That said, the reader always deserves to know at what
diffraction resolution average intensities (cease to) exist. The usual
statistical practice for doing so is to quote a given quantity's (i.e., in
this case, a reflection intensity's) sigma.

Good effort is made in data-processing programs to protect the quality
of the estimate of each reflection intensity's sigma, notably via the
chi-squared test.

Thus I request that the diffraction resolution where <I/sig(I)>
crosses 2.0 be quoted in the article, if it is not there already. I
agree that 2.0 is arbitrary, but it is more 'lenient' than the usual
'3 sigma' statistical test.
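To make the criterion concrete, here is a minimal sketch (not taken from any data-processing program; the shells and reflection data are invented for illustration) of finding the highest-resolution shell in which <I/sig(I)> still reaches 2.0:

```python
# Minimal sketch: find the resolution at which <I/sig(I)> falls below 2.0.
# Shell edges and (I, sigma) pairs below are invented for illustration.

def mean_i_over_sigma(shell):
    """Average I/sig(I) over the reflections in one resolution shell."""
    return sum(i / s for i, s in shell) / len(shell)

# Each entry: (high-resolution edge of the shell in Angstrom, [(I, sigma), ...])
shells = [
    (3.0, [(120.0, 10.0), (90.0, 9.0)]),   # strong data
    (2.5, [(40.0, 8.0), (30.0, 7.5)]),
    (2.2, [(16.0, 7.0), (14.0, 7.0)]),     # just above the 2.0 criterion
    (2.0, [(8.0, 6.0), (6.0, 6.0)]),       # below the criterion
]

cutoff = None
for d_min, shell in shells:
    if mean_i_over_sigma(shell) >= 2.0:
        cutoff = d_min   # last shell still meeting the criterion
    else:
        break

print(cutoff)  # -> 2.2
```

With these invented numbers the article would quote 2.2 Angstrom as the resolution at which <I/sig(I)> crosses 2.0; real programs of course use many reflections per shell.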

Sometimes the title or abstract has to be changed to follow this
criterion; e.g., a title of the form 'The structure of xxx determined
to 2.4 Angstrom resolution' must be consistent with the above
criterion.

I do not follow an 'Rmerge must be less than x% rule'.

I think the above follows reasonable general statistical practice,
whilst permitting authors reasonable freedom and protecting the
(more innocent) readers of articles.

I am aware that the 'correlation coefficient' between randomly
portioned parts of data sets is being increasingly discussed, this
parameter also having general statistical validity; I am monitoring
that discussion carefully. It has long been a good way of assessing
the statistical quality of anomalous differences, for example; to my
knowledge it was introduced by Michael Rossmann many years ago.
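For readers unfamiliar with the idea, a minimal sketch of such a half-dataset correlation coefficient follows (the measurements are invented; real implementations work per resolution shell over many reflections): randomly split each reflection's measurements into two halves, average each half, and correlate the two resulting sets of mean intensities.

```python
# Minimal sketch of a half-dataset correlation coefficient.
# Invented data: one inner list of repeated measurements per unique reflection.
import random
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def half_dataset_cc(measurements, rng):
    """Randomly portion each reflection's measurements into two halves
    and correlate the half-dataset mean intensities."""
    half1, half2 = [], []
    for obs in measurements:
        obs = obs[:]
        rng.shuffle(obs)          # random portioning of the measurements
        mid = len(obs) // 2
        half1.append(mean(obs[:mid]))
        half2.append(mean(obs[mid:]))
    return pearson(half1, half2)

rng = random.Random(0)
data = [[100, 104, 98, 102], [50, 53, 48, 51], [20, 22, 19, 21], [8, 9, 7, 8]]
cc = half_dataset_cc(data, rng)
print(round(cc, 3))
```

With measurement scatter this small relative to the spread of true intensities, the coefficient comes out close to 1; as the signal fades at high resolution the two halves decorrelate and the coefficient drops toward zero.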

Best wishes,
John


On Fri, Jan 27, 2012 at 5:55 PM, Jacob Keller
<j-kell...@fsm.northwestern.edu> wrote:
> Clarification: I did not mean I/sigma of 2 per se, I just meant
> I/sigma is more directly a measure of signal than R values.
>
> JPK
>
> On Fri, Jan 27, 2012 at 11:47 AM, Jacob Keller
> <j-kell...@fsm.northwestern.edu> wrote:
>> Dear Crystallographers,
>>
>> I cannot think why any of the various flavors of Rmerge/meas/pim
>> should be used as a data cutoff and not simply I/sigma--can somebody
>> make a good argument or point me to a good reference? My thinking is
>> that signal:noise of >2 is definitely still signal, no matter what the
>> R values are. Am I wrong? I was thinking also possibly the R value
>> cutoff was a historical accident/expedient from when one tried to
>> limit the amount of data in the face of limited computational
>> power--true? So perhaps now, when the computers are so much more
>> powerful, we have the luxury of including more weak data?
>>
>> JPK
>>
>>
>> --
>> *******************************************
>> Jacob Pearson Keller
>> Northwestern University
>> Medical Scientist Training Program
>> email: j-kell...@northwestern.edu
>> *******************************************
>
>
>
> --
> *******************************************
> Jacob Pearson Keller
> Northwestern University
> Medical Scientist Training Program
> email: j-kell...@northwestern.edu
> *******************************************



-- 
Professor John R Helliwell DSc
