DNA & Aging

Paul Boduch (ES 1997) pboduch at minerva.cis.yale.edu
Mon Nov 6 21:40:44 EST 1995


	O.K. This is a reply to the first response to my post, which came 
from Michael Gregory Abel. I thought I should post it here too for interested 
parties. Posting it as an attachment is the only way I could do it; I still 
have to figure out how to post to newsgroups while replying to people.
-------------- next part --------------
From pboduch at minerva.cis.yale.edu Mon Nov  6 21:36:15 1995
Date: Mon, 6 Nov 1995 21:05:01 -0500 (EST)
From: "Paul Boduch (ES 1997)" <pboduch at minerva.cis.yale.edu>
To: "Michael Gregory Abel, University of Tennessee"
     <abel at UTKVX.UTCC.UTK.EDU>
Subject: Re: DNA & Aging

On Mon, 6 Nov 1995, Michael Gregory Abel, University of Tennessee wrote:

> By the way, some scientists believe that the human life-span will be increased 
> by as much as 100% in the near future but in the light of the current 
> population problem and limited resources, who would want to live in a 
> future of scarcity?

	I don't believe the future will be one of scarcity, for one reason:
progress toward building a universal constructor/deconstructor of matter has
been truly spectacular these past few years. Scientists have already 
succeeded in capturing and manipulating atoms in three dimensions using a 
single, highly focused laser beam. Laser fabrication techniques have also 
reached the point where 99%-efficient lasers 0.5 micron or smaller can be 
built cheaply and in huge quantities. These new mini-lasers can be made to 
produce exactly the kinds of light needed to capture and manipulate any atom 
or molecule. Together with binary optics and new, highly precise molecular 
imaging techniques, they make a fast universal constructor/deconstructor of 
matter feasible for the first time. Physicists have even worked out a way 
around the Heisenberg uncertainty principle, that is, around the quantum 
distortions in the data that may result from the initial scanning of the 
atoms of the object to be deconstructed, although I believe it would be far 
easier to correct these problems in software. I'll fill in the details for 
you later if you are interested; for now I'll save space here and some of my 
time. Suffice it to say that such a machine would make scarcity, 
environmental pollution, and health problems a thing of the past.

For a few references, see:
		    1) Scientific American, February 1992, "Laser Trapping of
		       Neutral Particles"
		    2) Scientific American, November 1991, "Microlasers"
		    3) Scientific American, November 1990, "Diminishing Dimensions"
		    4) Scientific American, May 1992, "Binary Optics"
		    5) Discover, 1993, Physics, "Getting There Is Half the Fun"
		       (a way out of the Heisenberg uncertainty principle)
		    6) Business Week, Dec. 6, 1993, "AT&T Brings Molecules into
		       Sharper Focus" (a new near-field optical scanning
		       microscope allows researchers to generate a precise map
		       of individual molecules on the surface of a specimen)


> 	The third theory (which I believe is the case) is simply a 
> combination of the first two.  Both programmed cell death and oxidative 
> damage contribute to the timely demise of all organisms.

	I agree. I simply wanted to know if there was someone out there 
who thought along the same lines. The problem is then two-fold: 
environmental and genetic. The environmental one is relatively easy; I
believe the "potion of youth" approach would work. Great strides have 
also been made in gene therapy. Besides, targeting should not be a 
problem, since I had periodic replacement of the entire DNA in mind. I had a 
number of magazine articles describing the great strides in gene therapy, 
but unfortunately didn't save them. If you see any problems with this 
approach, please let me know.
	
	As for the genetic problem, here's my solution:
	1) First, I believe all current research in molecular biology 
	   is moving too slowly and spasmodically. By studying individual
	   genes and proteins, and then spending years trying to figure out
	   the intermediate steps involved, researchers at best get
	   incomplete snapshots of what happens inside the cell while
	   missing most of the movie. Here's a way to speed up the research
	   and make it more accurate: 
	2) The cell has to be reduced to its fundamentals, namely information.
	   I propose freezing a cell and deconstructing it layer by layer 
	   while saving the pertinent information about the composition 
	   of each layer and its layout [the types of atoms making it up
	   and their positions in space]. 
	   The process would work something like this:
	   
	   a) a "snapshot" of the layer is recorded using the near-field
              optical scaning microscope described above
           b) the layer is stripped using millions of precisely targeted
	      [here's where binary optics comes in] mini-lasers; thus the 
	      process should be fast
	   c) the entire cycle is repeated until the cell is deconstructed
	* Any distortions in the tertiary structure of molecules caused
	  by deconstruction could later be corrected in the computer model
	  by reconstructing the appropriate snapshots, and/or by using the 
	  existing body of knowledge about cellular construction. Perhaps 
	  properly designed software could do this automatically.
	3) Once all the pertinent information is in the computer, a 
	   complete, true-to-life model of the cell can be constructed, 
	   containing all the empirical data about the cell a researcher 
	   will ever need. By introducing the computer equivalent of 
	   temperature (some equation for molecular motion) and the right
	   simulated medium (computer H2O with the right mix of nutrients), 
	   the model can be turned into a real-time emulation of the cell. 
	   Assuming all (or most of) the pre-programmed rules of molecular 
	   interaction are right, all cellular processes can be easily 
	   studied, and the emulation can be sped up to shed light on aging
	   (a minimal sketch of such an emulation loop also follows below).
	   In this scenario, you not only have the code, but also the complete
	   revelation of how it is interpreted by the cell, something
	   that the Human Genome Project does not promise at all. The emulated
	   (or simulated, depending on your views on A-life) cell would also
	   shed new light on many details of specific types of 
	   molecular interaction. The need to expand or revise our understanding
	   of molecular biology would become especially obvious if the 
	   emulated cell did not behave like a real cell in some respects. 
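
	To make step 2 concrete, here is a minimal sketch, in Python, of the
snapshot/strip/repeat loop. It is purely illustrative: the "cell" is a toy
3-D grid of atom labels, and the snapshot and stripping steps only stand in
for the near-field scan and the laser ablation; no real instrument software
is implied, and all of the names below are made up.

import numpy as np

def make_toy_cell(nx=4, ny=4, nz=6):
    # A fake "cell": a 3-D grid whose entries name the atom at each site.
    rng = np.random.default_rng(0)
    return rng.choice(["H", "C", "N", "O"], size=(nz, ny, nx))

def snapshot_layer(cell, depth):
    # Stand-in for the near-field optical scan: record the exposed layer
    # as (atom type, position) pairs.
    layer = cell[depth]
    return [(str(layer[y, x]), (x, y, depth))
            for y in range(layer.shape[0])
            for x in range(layer.shape[1])]

def deconstruct(cell):
    # a) snapshot each layer, b) strip it, c) repeat until the cell is gone.
    records = []
    for depth in range(cell.shape[0]):
        records.append(snapshot_layer(cell, depth))
        # The laser-ablation step would happen here; in this toy model the
        # layer is simply treated as removed once it has been recorded.
    return records

atom_map = deconstruct(make_toy_cell())
print(len(atom_map), "layers,", sum(len(l) for l in atom_map), "atom records")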
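
	And here, in the same spirit, is an equally rough sketch of the
"computer equivalent of temperature" idea from step 3: particle positions
jitter with a step size that grows with temperature, and a time-step
multiplier stands in for speeding the emulation up. None of this is a real
cell model; the units and the diffusion rule are invented purely for the
example.

import numpy as np

def emulate(positions, temperature, steps, speedup=1.0, seed=0):
    # Random-walk "molecular motion": displacement scale ~ sqrt(T * dt).
    rng = np.random.default_rng(seed)
    dt = 1e-3 * speedup                # larger dt means a faster emulated clock
    sigma = np.sqrt(temperature * dt)  # crude thermal step size (arbitrary units)
    for _ in range(steps):
        positions = positions + rng.normal(0.0, sigma, positions.shape)
    return positions

# 1000 "molecules" scattered through a unit cube of simulated water at
# body temperature, run for 500 steps with a 10x speed-up.
molecules = np.random.default_rng(1).uniform(0.0, 1.0, size=(1000, 3))
final = emulate(molecules, temperature=310.0, steps=500, speedup=10.0)
print("mean displacement:", np.linalg.norm(final - molecules, axis=1).mean())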

I don't believe computational problems pose a challenge to a computer 
emulation of a single cell anymore. With light-based computers on the 
horizon, processing speeds will soon be nearly unlimited, as will memory 
storage. For these reasons, it is incomprehensible to me why something like 
this has not been attempted yet, at least to my knowledge. I can't understand 
this bashful tendency to think small in science. Why would someone want 
to study pebbles on the seashore while ignoring a whole sea of knowledge 
that is finally accessible?




