sigma_cutoff

Joe Krahn krahn at niehs.nih.gov
Fri Apr 7 18:13:03 EST 2000


Yu Wai Chen wrote:
> 
> For I/sigI = 1, your "signal" is only as high as your noise; i.e. it is
> not a signal statistically.  For I/sigI < 1, you are including pure
> noise.
That is wrong. You know that the observation is less than the noise level
of the detector. Would you be willing to allow a reflection to refine to a
large value even though you couldn't see any measurable diffraction?
A very weak reflection can do as much good as a strong reflection. If you
want to throw out data, you might as well throw out average-valued
reflections.
This is why using any sigma cutoff is always bad for refinement. To know
whether a reflection is doing the refinement any good, you should not look
at I/sigma, but at whether (Iobs - Icalc)/sigma is > 1. If it is, that
reflection is helping. But there is no point in throwing it out either way,
because Iobs - Icalc changes as the model changes, and computers are fast
now.
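[Editorial illustration, not part of the original post: a minimal Python
sketch with made-up reflections (the arrays iobs, icalc, sigma and all
their values are hypothetical), showing that the reflections an I/sigma
cutoff rejects are not necessarily the ones that fail to constrain the
model.]

    import numpy as np

    # Made-up example reflections: observed intensity, model intensity,
    # and estimated sigma (all values hypothetical).
    iobs  = np.array([0.5, 2.0, 10.0, 1.0, 25.0])
    icalc = np.array([3.0, 2.1,  9.5, 0.2, 24.0])
    sigma = np.array([1.0, 1.0,  1.0, 0.8,  2.0])

    # What a conventional cutoff keeps: I/sigma >= 2
    kept_by_cutoff = iobs / sigma >= 2.0

    # What actually indicates a reflection is constraining the model:
    # |Iobs - Icalc| / sigma > 1
    informative = np.abs(iobs - icalc) / sigma > 1.0

    print("kept by I/sigI >= 2  :", kept_by_cutoff)
    print("|Iobs-Icalc|/sig > 1 :", informative)
    # The first (very weak) reflection fails the cutoff yet disagrees
    # strongly with the model, so it carries real information.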

> Matthew Meyer wrote:
> >
> >     I've been re-refining some mutant structures.  This time I'm
> > throwing away peaks where I/sigI < 2.0.  R and Rfree are much improved
> > (by typically 4 %) over the previous refinements when sigma_cut = 0.0.
Any time you throw out weak values, the working and free Rs will both go
down a little, even with no refinement (try it), but that doesn't mean
it did you any good. If you do use sigma cutoffs, they really shouldn't
be applied to Rfree observations.
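[Editorial illustration of the "try it" point, not from the original post:
a rough Python sketch using entirely synthetic amplitudes (f_true, f_calc,
f_obs, sigma, and the helper r_factor are all invented for this example).
With the model held fixed, simply discarding weak observations lowers the
conventional R factor.]

    import numpy as np

    rng = np.random.default_rng(0)

    # Entirely synthetic data: "true" amplitudes, a fixed imperfect model,
    # and measurement noise that dominates the weak reflections.
    n      = 10000
    f_true = rng.gamma(shape=2.0, scale=10.0, size=n)
    f_calc = f_true * (1.0 + 0.05 * rng.normal(size=n))   # model never changes
    sigma  = np.full(n, 5.0)
    f_obs  = f_true + sigma * rng.normal(size=n)

    def r_factor(fo, fc):
        # Conventional R = sum|Fobs - Fcalc| / sum(Fobs)
        return np.sum(np.abs(fo - fc)) / np.sum(fo)

    keep = f_obs / sigma >= 2.0   # the kind of cutoff under discussion

    print("R, all data        :", round(r_factor(f_obs, f_calc), 4))
    print("R, F/sigF >= 2 only:", round(r_factor(f_obs[keep], f_calc[keep]), 4))
    # The second number comes out lower even though the model is identical;
    # the apparent improvement comes purely from discarding weak observations.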

> > However, after expansion to full resolution, far fewer waters are picked
> > before further picking pushes R back up.
That's because it's easier to overfit with less data.
You should set the sigma cutoff to 0 if you want a good structure.
If you just want to lower Rwork and Rfree, set the sigma cutoff to about 10.

Joe Krahn



