> Perhaps there are things from computational physics that could help or
> things from computational biology that could help computational
> physics.
Absolutely! I am sure that many mathematical theorems proven for one
application could very well apply to another, if only the right
people knew about them. Let me tell you some of the stuff we do in
our department, in the hope that it triggers something in your mind
that may be applicable.
In the Department of Cognitive and Neural Systems (CNS) at Boston
University we explore theoretical models of natural (neural)
computational mechanisms. Although others make detailed models of
the incredibly complex dynamics within a single neuron, our emphasis
is at a higher, more symbolic level, where we are concerned less
with neural spiking dynamics and intracellular mechanisms and more
with signal processing and information representation. The
distinction is the same as that between electrical engineers, who
are concerned with voltages and time delays, and computer
specialists, who think in terms of logic gates and boolean
operations, where all values are 0 or 1.
Our models, however, are not computer-like boolean mechanisms (as
are common in more traditional "AI" {artificial intelligence}
models); rather, the fundamental unit of our models is a dynamic
"node", which may represent the spiking frequency of a single
neuron, the activity of a block of neural tissue, or even just an
abstract perceptual "notion" that is active in the brain. The
activity of this
node is represented by a time-varying quantity x, as defined by a
differential equation, such as...
    dx/dt  =  -Ax  +  (B-x)E   -   xI
             DECAY  EXCITATORY INHIBITORY
where A is the passive decay rate (without input, x drifts slowly
back to zero), E and I are the excitatory and inhibitory inputs
(usually each a sum of many inputs from neighboring nodes), and
(B-x) and x are the "shunting" terms that hold the value of x within
the bounds 0 to B (usually B=1), like an analog op-amp that
saturates with large inputs. The excitatory and inhibitory
connections between such nodes transfer signals from one to another
through (theoretical) "synapses" whose conductances can vary slowly
over time, depending on the average pre- and post-synaptic activity
of the nodes that they connect (there are a variety of "learning"
rules).
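For concreteness, here is a minimal numerical sketch of a single
such node (in Python, using simple forward-Euler integration; the
parameter values and the constant inputs are my own illustrative
assumptions, not any particular CNS model):

  # Shunting node:  dx/dt = -A*x + (B - x)*E - x*I
  # The (B - x) and x terms shut off the excitatory and inhibitory
  # drive as x approaches the rails, so x stays bounded in [0, B].
  A, B = 1.0, 1.0              # passive decay rate, upper bound (assumed)
  dt = 0.01                    # Euler time step

  def step(x, E, I):
      """One forward-Euler step of the shunting equation."""
      return x + dt * (-A * x + (B - x) * E - x * I)

  x = 0.0
  for t in range(2000):
      x = step(x, E=5.0, I=1.0)        # constant inputs, for illustration
  print(round(x, 3))                   # ~0.714, i.e. B*E/(A+E+I)

However large the summed inputs get, the activity just saturates
gracefully toward B instead of blowing up.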
Inspired by neurophysiology, our models are parallel analog mechanisms
with multiple criss-crossed connections and feedback loops, which
makes for extremely complex dynamics that are difficult to analyze
mathematically. Shown below is a typical circuit, with two fields of
nodes, F1 and F2, input lines to F1, and interconnections (axons &
synapses) between F1 and F2. (Typically such interconnections go
from each node in F1 to every node in F2, and there are often
reciprocal connections from F2 back to F1.) The pattern of synaptic
"weights" (conductances) determines the transform from the pattern
of activation in F1 to the pattern of activation in F2.
  F2    ( )     ( )     ( )     ( )
         |\     /|\     /|\     /|
         | \   / | \   / | \   / |
         |  \ /  |  \ /  |  \ /  |
         |   /   |   /   |   /   |
         |  / \  |  / \  |  / \  |
         | /   \ | /   \ | /   \ |
         |/     \|/     \|/     \|
  F1    ( )     ( )     ( )     ( )
         ^       ^       ^       ^
         |       |       |       |
                  INPUTS
For example, each node in F2 might respond to a particular pattern of
activations in F1 by virtue of that pattern being represented in the
F1 -> F2 synaptic weights, so that presentation of that pattern at the
inputs lights up the corresponding node in F2, thus performing a
"recognition" operation at F2. Competition within F2 (through lateral
inhibitory connections, not shown) would effect a "choice" between
competing interpretations of ambiguous patterns, and feedback
connections would complete missing elements or remove extraneous
elements of the pattern at F1 to make it conform more closely to the
patterns stored in the "memory" of the synaptic weights.
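To make that operation concrete, here is a toy sketch in Python (the
two stored patterns, the hard winner-take-all choice, and the simple
top-down read-out are simplifying assumptions of mine, not any
particular CNS model):

  import numpy as np

  # F1 -> F2 weights: one stored F1 pattern per F2 node.
  W = np.array([[1.0, 1.0, 0.0, 0.0],   # F2 node 0 "recognizes" pattern A
                [0.0, 0.0, 1.0, 1.0]])  # F2 node 1 "recognizes" pattern B

  x1 = np.array([1.0, 0.0, 0.0, 0.0])   # incomplete version of pattern A at F1

  # Bottom-up: each F2 node responds to the match between F1 and its weights.
  y2 = W @ x1

  # Competition within F2: lateral inhibition idealized as winner-take-all.
  choice = np.zeros_like(y2)
  choice[np.argmax(y2)] = 1.0

  # Top-down feedback: the winning node reads its weights back onto F1,
  # completing the missing element of the stored pattern.
  x1_completed = np.maximum(x1, W.T @ choice)

  print(y2)             # [1. 0.]        -> node 0 wins: "recognition" at F2
  print(x1_completed)   # [1. 1. 0. 0.]  -> pattern A completed at F1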
Mathematically speaking, this dynamic system, given initial
conditions (with the inputs on), will tend to settle into stable
states (points in state-space), where each stable point represents a
"memory". A lot of our mathematical analysis is devoted to ensuring
that such systems are stable (will not search forever), and to
examining their capacity to store patterns, learn patterns,
generalize by clustering similar patterns to the same stable state,
stabilize existing memories while maintaining the plasticity to
learn new memories, and so forth. We also use such
models to simulate specific neurocomputational mechanisms in vision,
motor control, and conditioning, which can be tested against
psychophysical data. Such models are thus perceptual or behavioral
models, rather than neurological models, since they simulate and
reproduce perceptual and behavioral phenomena rather than neural
responses.
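As a small illustration of the settling behavior described above,
here is one more sketch (Python; the recurrent on-center
off-surround wiring, the squaring signal function, and the parameter
values are illustrative assumptions of mine) of a field of shunting
nodes with lateral inhibition relaxing to a stable point under a
fixed input pattern:

  import numpy as np

  A, B, dt = 1.0, 1.0, 0.01
  I_in = np.array([0.9, 0.5, 0.3, 0.1])    # fixed external input pattern

  def f(x):
      # Faster-than-linear signal function (favors contrast enhancement).
      return x ** 2

  x = np.zeros(4)
  for t in range(20000):
      E = I_in + f(x)                       # on-center: input plus self-excitation
      I = f(x).sum() - f(x)                 # off-surround: inhibition from the others
      x_new = x + dt * (-A * x + (B - x) * E - x * I)
      if np.max(np.abs(x_new - x)) < 1e-9:  # settled into a stable point
          break
      x = x_new

  print(np.round(x, 3))   # settled activities, ordered like the inputs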
So, does this kind of model, a parallel analog dynamic system, bring
to mind similar dynamics in other physical systems, and are there any
interesting mathematical analyses of such systems that might pertain
to our kind of models?
--
(O)((O))(((O)))((((O))))(((((O)))))(((((O)))))((((O))))(((O)))((O))(O)
(O)((O))(((             slehar at park.bu.edu              )))((O))(O)
(O)((O))(((     Steve Lehar Boston University Boston MA    )))((O))(O)
(O)((O))(((      (617) 424-7035 (H) (617) 353-6741 (W)     )))((O))(O)
(O)((O))(((O)))((((O))))(((((O)))))(((((O)))))((((O))))(((O)))((O))(O)