I refrained from entering the fray during last month's discussion of anisotropic data in refinement, but I wonder if there is any consensus regarding its treatment.
It seems to me that during refinement, scaling to Fcalcs should be superior to even the very elegant likelihood methods. A second problem exists in calculating a meaningful Patterson, particularly for molecular replacement, where throwing out reflections is unpopular. A third problem arises in horrific cases where the scaling software will not allow inclusion of the high-resolution terms. Today's question seems to be about one of the latter two cases; it's unclear where Katja is stuck. For the refinement case, do others think that Fcalcs become anisotropic? Or that the likelihood method developed by Read and others is superior even at late stages? (A rough numerical sketch of what the correction Pavel suggests below amounts to is appended after the quoted thread.)

Thanks,
Ben

On 10/5/09 11:21 AM, "Pavel Afonine" <pafon...@lbl.gov> wrote:

> Hi Katja,
>
> you may consider trying this:
>
> http://www.doe-mbi.ucla.edu/~sawaya/anisoscale/
>
> but PLEASE do not deposit "corrected" data to the PDB.
>
> Also, I would just try to refine the structure and see how it goes (see
> if you really need to use the above tool).
>
> Pavel.
>
> On 10/5/09 8:21 AM, Katja Schleider wrote:
>>
>> Hi everybody,
>>
>> is there a way to improve crystals that diffract with strong anisotropy?
>> We got data between 2.5 and 4.0 A, and scala says we should cut these
>> data at 3.9 A. It's such a... I want to solve this structure!
>>
>> greetings
>>
>> Katja
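
For anyone curious what the anisoscale correction amounts to in practice, here is a minimal numpy sketch of its two advertised steps: ellipsoidal truncation and anisotropic B-factor scaling. It assumes an orthorhombic cell (so the B tensor is diagonal), and every function name, cell dimension, and tensor value below is hypothetical; this is the standard textbook formula, not the server's actual code.

    # Sketch of the two operations an anisotropy-correction tool performs.
    # Orthorhombic cell assumed (diagonal B tensor); all values illustrative.
    import numpy as np

    def ellipsoidal_keep(hkl, cell, d_limits):
        """Keep reflections inside an ellipsoid with direction-dependent
        resolution limits d_limits = (d_a, d_b, d_c) in Angstrom."""
        astar = 1.0 / np.asarray(cell, dtype=float)   # reciprocal axes
        s = np.asarray(hkl) * astar                   # scattering-vector components
        return ((s * np.asarray(d_limits)) ** 2).sum(axis=1) <= 1.0

    def aniso_scale(hkl, cell, B_diag):
        """Multiplicative correction exp(+(B11 h^2 a*^2 + B22 k^2 b*^2 +
        B33 l^2 c*^2) / 4) that flattens an anisotropic fall-off described
        by the trace-removed tensor B_diag (in A^2), fit elsewhere."""
        astar = 1.0 / np.asarray(cell, dtype=float)
        s2 = (np.asarray(hkl) * astar) ** 2
        return np.exp((s2 * np.asarray(B_diag)).sum(axis=1) / 4.0)

    # Roughly Katja's case: 2.5 A in two directions, only 4.0 A in the third.
    cell = (60.0, 70.0, 90.0)                         # hypothetical cell
    hkl = np.array([[20, 0, 0], [0, 0, 30], [5, 5, 10]])
    keep = ellipsoidal_keep(hkl, cell, d_limits=(2.5, 2.5, 4.0))
    corr = aniso_scale(hkl[keep], cell, B_diag=(-10.0, -10.0, 20.0))

With these numbers, a 3.0 A reflection along c* ([0, 0, 30]) is discarded while a 3.0 A reflection along a* ([20, 0, 0]) survives and is scaled down relative to the weak direction; a single spherical cutoff at 3.9 A would instead throw away all of the strong-direction data beyond 3.9 A.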