On Tuesday 28 September 2010 10:27:17 am Francis E Reyes wrote:
> Hi all
>
> I'm interested in the scenario where crystals were screened at home
> and gave lousy diffraction (say < 8-10A) but when illuminated with
> synchrotron radiation gave reasonable diffraction (> 3A).
> Why the discrepancy?
Such a happy outcome is very rare. I normally expect the high intensity at a beamline to push the line separating "weak but visible" from "too weak to see" out to higher resolution, so that more data become usable. But that isn't a change in the diffracting properties of the crystal, just a change in achievable signal-to-noise.

Two possibilities come to mind.

A) Unintentional annealing during transport and/or as a consequence of crystal handling after shipment. We've certainly seen this, but usually the result is bad ice rings rather than better diffraction.

B) Home screening used a relatively large beam that saw the entirety of a highly imperfect crystal. The synchrotron beam used a much smaller aperture that happened to illuminate only a sweet spot. This is one argument advanced for the use of micro-focus beams.

B') An extreme case of B: multiple crystals in the loop. Home screening caught a bad one; beamline screening caught a good one.

--
Ethan A Merritt
Biomolecular Structure Center, K-428 Health Sciences Bldg
University of Washington, Seattle 98195-7742
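[A rough back-of-the-envelope sketch of the signal-to-noise point above, assuming ideal Poisson counting statistics with negligible background; the factor of 100 in flux below is purely illustrative, not a measured value.]

\[
\sigma(N) \approx \sqrt{N}
\quad\Rightarrow\quad
\frac{I}{\sigma(I)} \propto \sqrt{N}
\]

Under these assumptions, a beam delivering roughly 100 times more photons per reflection improves \(I/\sigma(I)\) by only about \(\sqrt{100} = 10\). That factor can lift previously invisible high-resolution spots above the detection threshold, but it does not change the intrinsic order of the crystal.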