Cole Thomson thomson at
Thu Apr 23 16:29:55 EST 1998

I am refining against a frozen-crystal data set for which Denzo refined the
mosaicity to 1.4 degrees.  Even with the fairly high mosaicity, I was able
to get a final R-merge of 5% and an average I/sigma of 11, with complete
data to 2.3 A in P21.  After several rounds of model building/refinement,
my R-free seemed stuck at 30%.  I used Gerard Kleywegt's Dataman to check
for anisotropy and saw that reflections along k averaged about 0.5 in ln(I)
lower than those along h and l.  So I went ahead and ran the standard
X-PLOR anisotropy correction file and got the following correction matrix:

k = 2.988   B11 = 21.094   B22 =  9.853   B33 = 21.044
B12 = 0.000   B13 =  6.637   B23 =  0.000
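For intuition, a correction of this form rescales each reflection by a
direction-dependent Debye-Waller factor.  A minimal sketch of how such a
matrix acts on the data (this is not the X-PLOR code; the reciprocal cell
constants below are hypothetical placeholders, and the exp(-h.B.h/4)
convention is the textbook one, which X-PLOR's implementation may apply with
a different sign or scale convention):

```python
import math

# Hypothetical reciprocal cell constants in 1/Angstrom -- substitute the
# values for your own crystal.
ASTAR, BSTAR, CSTAR = 0.015, 0.012, 0.018

# Anisotropic B tensor and overall scale from the correction above.
K = 2.988
B11, B22, B33 = 21.094, 9.853, 21.044
B12, B13, B23 = 0.000, 6.637, 0.000

def aniso_scale(h, k, l):
    """Direction-dependent scale factor for reflection (h, k, l),
    using the standard anisotropic Debye-Waller form exp(-h.B.h/4)."""
    q = (B11 * (h * ASTAR) ** 2
         + B22 * (k * BSTAR) ** 2
         + B33 * (l * CSTAR) ** 2
         + 2 * B12 * h * k * ASTAR * BSTAR
         + 2 * B13 * h * l * ASTAR * CSTAR
         + 2 * B23 * k * l * BSTAR * CSTAR)
    return K * math.exp(-q / 4.0)

# A corrected amplitude would then be aniso_scale(h, k, l) * F_obs.
```

Note that in P21 (unique axis b), symmetry restricts the off-diagonal terms
to B12 = B23 = 0 with only B13 free, which is exactly the pattern in the
matrix above.  The small B22 relative to B11 and B33 means the weak k-axis
reflections are attenuated least, i.e. boosted relative to h and l.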

I then ran regular temperature-factor refinement and saw my R-free drop to
28.5.  I was pretty happy until I realized that my average B-factor had
also gone up from 37 to 53.  I was a little worried about the value being
37, but 53 really scares me.  I took a quick look at the maps from the new
model and corrected data set, and they do look good, better than they did
with the previous model.  Since then I have heard two different opinions
from people in the lab.  One group says that the overall B-factor is just
too high and the correction just didn't work properly.  The others say that
you can't fool R-free, and that B-factors are mostly just useful within the
context of the model.  I am going to run a bunch of omit maps to see what
they show.

I'd be interested in knowing what people think about whether or not this
"corrected" model is valid, and if there are other ways to check it.


Cole T. Thomson
Medical Scientist Training Program
Albert Einstein College of Medicine
W: (718) 430-2743
Email: thomson at


More information about the Xtal-log mailing list