Yuri,

Detwinning relies on having both twin-related reflections present to calculate either of the detwinned data values. It therefore magnifies incompleteness, depending on where your missing data lie with respect to the twin operator.

I'd recommend against trying to do this with a twin fraction close to 0.5. From the DETWIN docs:

Itrue(h1) = ((1-tf)*Itw(h1) - tf*Itw(h2)) / (1-2*tf)

where tf is the twin fraction. As tf approaches 0.5, the prefactor 1/(1-2*tf) becomes a very large number, and it multiplies a weighted difference of the form (Itw(h1) - Itw(h2)), which becomes very small at the same time. That difference can easily be smaller than sigma(I), so the signal-to-noise of your detwinned data plummets.
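A quick numerical sketch of this effect, using the DETWIN formula above plus standard error propagation (assuming independent errors on the two twinned observations; the intensities and sigmas below are made-up illustrative values, not real data):

```python
import math

def detwin(i1, i2, tf):
    # DETWIN formula quoted above for the true intensity of h1
    return ((1 - tf) * i1 - tf * i2) / (1 - 2 * tf)

def detwin_sigma(s1, s2, tf):
    # Propagated sigma, assuming independent errors on Itw(h1), Itw(h2)
    return math.sqrt(((1 - tf) * s1) ** 2 + (tf * s2) ** 2) / abs(1 - 2 * tf)

# Hypothetical twin-related intensities with ~5% errors
i1, i2 = 1000.0, 900.0
s1, s2 = 50.0, 45.0

for tf in (0.1, 0.3, 0.45, 0.49):
    i_true = detwin(i1, i2, tf)
    sig = detwin_sigma(s1, s2, tf)
    print(f"tf={tf:.2f}  Itrue={i_true:8.1f}  sigma={sig:7.1f}  "
          f"I/sig={i_true / sig:5.2f}")
```

The I/sigma of the detwinned intensity falls steadily as tf approaches 0.5, even though the input observations have a healthy ~20:1 signal-to-noise.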

Better to use REFMAC's and phenix.refine's ability to account for the twin fraction directly during refinement, and leave your data as they are.

Phil Jeffrey
Princeton


On 9/29/11 10:03 AM, Yuri Pompeu wrote:
After I ran DETWIN with the estimated 0.46 alpha, my completeness for the 
detwinned data is now down to 54%!!!
Is this normal behavior? (I am guessing yes since the lower-symmetry untwinned 
data is P 1 21 1)