On 3/31/2011 12:52 PM, Jacob Keller wrote:
>> The only advantage of a large, positive number is that it would create
>> bugs that are more subtle.
> Although most of the users on this BB probably know more about software
> coding than I do, I am surprised that bugs--even subtle ones--would be
> introduced by residues flagged with 0 occupancy and b-factor = 500.
> Can you elaborate/enumerate?
The principal problems with defining a particular value of the B factor
as magical have already been listed in this thread. B factors are usually
restrained to the values of the atoms they are bonded to, and sometimes
to other atoms they pack against. You may set the B factor equal to
500.00, but it will not stick. At worst its presence will pull up the
B factors of nearby atoms that do have density.
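To make that concrete, here is a toy Python sketch (entirely invented:
three atoms, made-up weights, and a crude stand-in for the pull of the
density; it is not any real program's restraint code) of a bonded
B-factor similarity restraint being minimized:

    import numpy as np

    # Three bonded atoms; the third has been "flagged" with B = 500.
    B = np.array([20.0, 25.0, 500.0])
    bonds = [(0, 1), (1, 2)]                 # pairs sharing a similarity restraint
    B_target = np.array([20.0, 25.0, 0.0])   # crude stand-in for the density's pull
    has_density = np.array([1.0, 1.0, 0.0])  # the flagged atom sees no density term

    w_sim, w_obs, step = 1.0, 1.0, 0.05
    for _ in range(500):
        grad = np.zeros_like(B)
        for i, j in bonds:                   # gradient of w_sim * (B_i - B_j)**2
            grad[i] += 2.0 * w_sim * (B[i] - B[j])
            grad[j] -= 2.0 * w_sim * (B[i] - B[j])
        grad += 2.0 * w_obs * has_density * (B - B_target)
        B -= step * grad

    print(B)   # the 500 has collapsed toward its bonded neighbours

Hold the third value fixed at 500 instead (skip its update) and the same
restraint term drags the neighbouring B factors far upward; either way
the "flag" does not survive contact with the restraints.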
In addition, the only refinement program I know of that takes occupancy
into account when restraining bond lengths and angles is CNX. In every
other program, atoms with occ = 0 will still affect the locations of the
atoms they share geometry restraints with.
Of course you could modify the refinement programs, and every other
program that reads crystallographic models, to deal with your redefinition
of _atom_site.B_iso_or_equiv. In fact you would have to, just as you would
have to when you change the definition of any of the parameters in our
models. If we have to modify code, why not create a solution that is
explicit, clear, and as consistent with previous practices as possible?
> I think that the worst that could happen is that the inexperienced yet
> b-factor-savvy user would be astonished by the huge b-factors, even if
> he did not realize they were flags. At best, being surprised at the
> precise number 500, he would look into the pdb file and see occupancy
> = zero, google it, and learn something new about crystallography.
How about positive difference map peaks on neighboring atoms? How
about B factor values that no longer relate to the mean-square
displacement of the atom, despite that being the direct definition of
the B factor?
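(For reference: the isotropic B factor is defined by B = 8*pi^2*<u^2>,
so a value of 500 A^2 taken literally would correspond to <u^2> of about
6.3 A^2, an RMS displacement of roughly 2.5 A.)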
The concept of an "inexperienced yet b-factor-savvy user" is amusing.
I'm not b-factor-savvy. Atomic displacement values are easy, but I'm
learning new subtleties about B factors all the time.
The fundamental problem with your solution is that you are trying to
cram two pieces of information into a single number. Such density always
causes problems. Each concept needs its own value.
> What two pieces of information into what single number? Occupancy = 0
> tells you that the atom cannot be modelled, and B=500 is merely a flag
> for same, and always goes with occ=0. What is so dense? On the
> contrary, I think the info is redundant if anything...
To be honest, I had forgotten that you were proposing that the occupancy
be set to zero at the same time. Besides putting two pieces of information
in the B factor column (the B factor's value and a flag for "imaginariness"),
you do the same for occupancy (the occupancy's value and a flag for
"imaginariness"). This violates another rule of data structures - that
each concept be stored in one, and only one, place. How do you interpret
an atom with an occupancy of zero but a B factor of 250? How about an
atom with a B factor of 500.00 and an occupancy of 1.00? Now we have the
confusing situation that the B factor can only be interpreted in the
context of the value of the occupancy, and vice versa. Database-savvy
people (and I'm not one of them either) are not going to like this.
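Spelled out as code (a purely illustrative Python sketch; the labels are
mine, not part of any proposal), every reader of such a file ends up
having to make rulings like these:

    def classify(b, occ):
        # Only the first combination was actually given a meaning by the
        # proposed convention; the middle two are anyone's guess.
        if occ == 0.0 and b == 500.0:
            return "imaginary atom (the intended flag)"
        if occ == 0.0:
            return "??? occupancy is zero, but B is not the magic 500"
        if b == 500.0:
            return "??? B is the magic 500, but the occupancy is nonzero"
        return "ordinary atom"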
If you want to calculate the average B factor for a model, certainly
those atoms with B factor = 500 should not be included. However,
I gather we do need to include those equal to 500 if their occupancy is
not equal to 0.0. This is a mess. In a database application we can't
simply SELECT the column of B factors and average it. We have to
SELECT both the B factor and occupancy columns and perform some really
weird "if" statements element by element - just to calculate an average!
What should be a simple task becomes very complex. Will a graduate
student code the calculation correctly? Probably not. They will likely
not recall all the complicated interpretations of special values your
convention would require.
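As an illustrative Python sketch (made-up field names, not any real
library's interface) of what that calculation turns into:

    def mean_b_plain(atoms):
        # What the calculation ought to be: one column, one mean.
        bs = [a["b"] for a in atoms]
        return sum(bs) / len(bs)

    def mean_b_with_flags(atoms):
        # The same calculation once B = 500 / occ = 0 doubles as a flag:
        # every program that touches the file must reproduce this if-logic,
        # element by element, just to get an average.
        bs = [a["b"] for a in atoms
              if not (a["b"] == 500.0 and a["occ"] == 0.0)]
        return sum(bs) / len(bs) if bs else float("nan")

    atoms = [{"b": 22.1, "occ": 1.0},
             {"b": 35.4, "occ": 1.0},
             {"b": 500.0, "occ": 0.0}]   # the "imaginary" side-chain atom
    print(mean_b_plain(atoms), mean_b_with_flags(atoms))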
Now consider this. Refinement is running along and the occupancy for
an atom happens to overshoot and, in the middle of refinement, assumes
a value of 0.00. There is positive difference density the next cycle.
(I did say that it overshot.) Should the refinement program interpret
that Occ=0.00 to mean that the atom is imaginary and should not be
considered as part of the crystallographic model? Wouldn't it be bad
if the atom suddenly disappeared because of a fluctuation? Or should
the refinement program use one definition of "occupancy" during
refinement, but write a PDB file occupancy that has a different definition?
(It might be relevant to this line of thought to recall that the TNT
refinement package writes each intermediate coordinate file to disk and
reads it back in at the start of the next cycle. There can be no
difference between the meaning of the parameters in memory and on disk.)
A great deal of what we do with our models depends on the details of
the definitions of these parameters. Adding extra meanings and special
cases causes all sorts of problems at all levels of use.
>> either. You can't out-think someone who's not paying attention. At
>> some point you have to assume that people being paid to perform research
>> will learn the basics of the data they are using, even if you know that
>> assumption is not 100% true.
> Well, the problem is not *should* but *do*. Should we print bilingual
> danger signs in the US? Shouldn't we assume that people know English?
> But there is danger, and we care about sparing lives. Here too, if we
> care about the truth being abused or missed, it seems we should go out
> of our way.
I've not advocated doing nothing. I've advocated that whatever solution
we choose should be clearly defined, and that its definition be consistent
with past definitions (as much as possible) and with the principles of
data structures created by the people who study such things. We *should*
go out of our way to find a solution to this common problem. The solution
we choose should be one that actually solves the problem rather than
simply creating more confusion.
Dale Tronrud
P.S. I just Googled "occupancy zero". The top hit is a letter from
Bernhard Rupp recommending that occupancies not be set to zero :-)