junk DNA

Jay Mone jaymone at paonline.com
Sat Apr 15 08:28:21 EST 2000


Andrew,

You wrote:
>It doesn't work that way.  The error rate is a probability, with
>something like 1 error in 100,000 (or so) for eucaryotic DNA.
>Doubling the amount of DNA also doubles the amount of errors.

But it should work this way.  As an example, consider your windshield in a
light rain.  Draw a 1 cm diameter circle on the windshield, and nest it in a
10 cm diameter circle.  Which circle gets more hits (mutations)?  The larger
circle, of course, but the number of drops landing inside the small circle
doesn't change.
I agree that the more DNA you have, the more mutations you'll accumulate.
However, if the mutations are in non-coding regions, who cares?
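
To put rough numbers on it (these are illustrative figures of my own, using
only the ~1 in 100,000 rate you quoted), here is a quick Python sketch showing
that the expected number of harmful hits tracks the amount of coding sequence,
not the total genome size:

    ERROR_RATE = 1e-5   # ~1 error per 100,000 bases per replication, as quoted above

    def expected_errors(genome_size, coding_fraction):
        # Expected total errors, and the subset landing in coding sequence.
        total = genome_size * ERROR_RATE
        return total, total * coding_fraction

    # A compact, fully coding genome vs. one 10x larger, padded with junk DNA
    # around the same 3,200 coding bases (numbers are illustrative).
    print(expected_errors(3_200, 1.0))     # roughly (0.03, 0.03)
    print(expected_errors(32_000, 0.10))   # roughly (0.32, 0.03): 10x the errors, same coding hits

Ten times the DNA does mean ten times the errors, but the expected number
falling in coding sequence is unchanged.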

>The converse should also be considered.  Is there a selective
>advantage to get rid of junk DNA?

If there were a selective advantage to removing junk DNA, we would have
gotten rid of it a long time ago.

Viruses are a little different.  Here you see the most efficient utilization
of genomes possible.  Consider hepatitis B, which has a genome of only
3200 bp.  Every base is part of at least one open reading frame (ORF) or a
regulatory region of an ORF.  Furthermore, in this and many other viruses,
you see overlapping ORFs and other strategies to increase the amount of
information that can be stored.  The selective pressure against junk DNA in
viruses has nothing to do with the energy needed to copy the DNA, since it's
the host cell that does this anyway.  Rather, it is the problem of packaging
the genome into geometrically precise capsids that represents the selective
pressure against genome size.
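
As a rough back-of-the-envelope sketch in Python (the ORF coordinates below
are invented for illustration, not the real HBV annotation), overlap is what
lets the summed coding length exceed the genome length itself:

    GENOME_LENGTH = 3200   # hepatitis B genome size in bp, as above

    # (start, end) positions of hypothetical ORFs; coordinates are made up
    # and only loosely echo HBV's overlapping P/S/C/X genes.
    orfs = {
        "polymerase": (0, 2500),
        "surface":    (400, 1600),   # nested within polymerase, different frame
        "core":       (1900, 2450),
        "X":          (2600, 3050),
    }

    total_coding = sum(end - start for start, end in orfs.values())
    print(f"summed ORF length: {total_coding} bp over a {GENOME_LENGTH} bp genome")
    print(f"coding density: {total_coding / GENOME_LENGTH:.2f}x")   # exceeds 1.0 only via overlap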


Jay M.
