Hi Thomas,

Strictly speaking, there is no need to re-process the data, because doing so 
would not change the data anyway; it would only discard the reflections beyond 
the new cutoff. It may even be that the data beyond 2.55 A contain some signal 
(CC1/2 should tell you about that) which future refinement programs can use - 
so you could deposit at the RCSB all data out to 2.35 A (if that is where 
CC1/2 becomes insignificant), and future PDB_REDO calculations could make use 
of that.
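
In case you want to compute it yourself rather than read it off the XDS 
tables: CC1/2 is just the Pearson correlation, per resolution shell, between 
the merged intensities of two random half-datasets. A minimal sketch in 
Python - the function and array names are my own, and I assume each 
half-dataset has already been merged so that there is one intensity per 
unique reflection in the shell:

    import numpy as np

    def cc_half(i_half1, i_half2):
        # Pearson correlation between the mean intensities of two
        # random half-datasets, one value per unique reflection in
        # a given resolution shell (Karplus & Diederichs, Science 2012).
        i1 = np.asarray(i_half1, dtype=float)
        i2 = np.asarray(i_half2, dtype=float)
        return np.corrcoef(i1, i2)[0, 1]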

What most people still do, however, is attempt (for the infamous Table 1 ...) 
to match the high-resolution cutoff of the data processing to the 
high-resolution cutoff of the refinement. I say "attempt" because the width 
of the highest-resolution shell (in the data processing and in the refinement 
program) in most cases still does _not_ match. And AFAIK no journal formally 
requires the two cutoffs to be the same. Having the same cutoff rests on the 
old and flawed assumption that Rmerge in the highest-resolution shell tells 
you something about the "quality" of the data being used for refinement.
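
To spell out what Rmerge actually measures, here is a sketch (again Python, 
not taken from any particular program; obs_by_hkl is a hypothetical list 
holding one array of unmerged observations per unique reflection):

    import numpy as np

    def r_merge(obs_by_hkl):
        # Rmerge = sum_hkl sum_i |I_i(hkl) - <I(hkl)>|
        #          / sum_hkl sum_i I_i(hkl)
        num, den = 0.0, 0.0
        for obs in obs_by_hkl:
            obs = np.asarray(obs, dtype=float)
            num += np.abs(obs - obs.mean()).sum()
            den += obs.sum()
        return num / den

In a weak shell the denominator approaches zero while the numerator does not, 
so Rmerge blows up even when genuine signal is present - which is exactly why 
it is a poor criterion for choosing a cutoff.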

What I personally do is adjust the cutoff in data processing such that CC1/2 
in the highest-resolution shell is still significant at the 1-in-1000 level 
(this information is available from the XDS output). I stick with this 
dataset, which has all the information about the crystal and is future-proof 
in the sense mentioned above. I then do paired refinement to find the best 
high-resolution cutoff for refinement, and I find that this depends to some 
extent on the refinement program and the quality of the model. So I might end 
up with a slightly lower high-resolution cutoff in refinement than in data 
processing, but I (and usually the reviewers!) don't consider this a real 
problem.
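
For completeness, the 1-in-1000 test mentioned above is a one-sided Student's 
t test of CC1/2 > 0, with n-2 degrees of freedom for n unique reflections in 
the shell. A rough sketch (whether XDS does exactly this bookkeeping 
internally I leave aside, so treat it as illustrative):

    import numpy as np
    from scipy import stats

    def cc_half_significant(cc, n, p=0.001):
        # One-sided test: is CC1/2 significantly greater than zero
        # at level p? Uses Student's t with n-2 degrees of freedom.
        if n < 3:
            return False
        if cc >= 1.0:
            return True
        t = cc * np.sqrt(n - 2) / np.sqrt(1.0 - cc * cc)
        return t > stats.t.ppf(1.0 - p, df=n - 2)

With, say, 500 unique reflections in a shell, a CC1/2 of about 0.15 is still 
significant at that level - small CC1/2 values can carry real information.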

hope this helps,

Kay



On Mon, 19 May 2014 23:07:54 -0400, Thomas Cleveland 
<thomas.clevel...@gmail.com> wrote:

>Hi all,
>
>This is a basic question and I'm sure the answer is widely known, but
>I'm having trouble finding it.
>
>I'm working on my first structure.  I have a dataset that I processed
>in XDS with a resolution cutoff of 2.35 A, although the data are
>extremely weak-to-nonexistent at that resolution limit.  After
>successful molecular replacement and initial refinement, I then
>performed "paired refinements" against this dataset cut to various
>resolutions (2.95 A, 2.85 A, 2.75 A, etc).  Based on the improvement
>in R/Rfree seen between successive pairs, it appears that the data
>should be cut at around 2.55 A.
>
>Here is my question: as I proceed with refinement (I'm currently using
>Phenix), should I now simply set "2.55 A" as the resolution limit in
>Phenix?  Or should I go back to XDS and actually reprocess the data
>with the new limit (2.55 A instead of 2.35 A)?
>
>Thanks,
>Tom
