Question on Analysis of Brainwaves

Matt Jones jonesmat at ohsu.edu
Tue Sep 30 12:37:20 EST 1997


In article <34301f8b.5419368 at news.tiac.net> Steven, Bagelboy at tiac.net
writes:
>I'm sorry to bother everyone, but I'm a New York City High School
>student researching the possibility of integrating your mind with a
>computer in the future so that your mind controls the computer.

Hi Steven,

This is not a bother, as long as you try to remain scientific and
objective in your research project and don't get too carried away with
new age sci-fi non-science. 

I don't think the idea is too farfetched for something in the far future,
either. 

Right now, to the best of my knowledge, "brainwaves" (EEG signals and
Event Related Potentials) are only very poorly understood. By this, I
mean that they are a rather low resolution picture of the millions of
underlying very complicated microscopic neuronal signals, and don't
really help too much in telling us what's going on underneath where the
real thinking is taking place. Another method that's been used is called
Functional Magnetic Resonance Imaging (fMRI), which, as the name suggests,
produces images of brain activity. This is an amazing method, but it is
still low resolution as far as individual neuronal events are concerned,
and also as far as being able to translate "thinking" is concerned. So
before we can use these methods to control machines, we need to know what
language they're speaking, and figure out how to translate it into a set
of commands for the machine. This is not going to happen any time soon.

However, there are a lot of other ways of interfacing people with
machines (aside from the really obvious ones, like computer keyboards). I
remember that even in the late seventies, IBM or AT&T were using the
electrical signals from the muscles around the eye socket to guide a
mouse around a computer screen. Wherever you look, the mouse follows. I'm
guessing that this technology is being used by fighter pilots now.
Similarly, in modern "Virtual Reality" technology, signals from the
position of your fingers can be translated into events happening in the
virtual world. So although true interfacing of minds and machines is a
long, long way off, people are working on simpler interfaces that they
understand better, and are having a great deal of success.
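The common thread in those simpler interfaces is a calibration step that maps a
measured signal (eye-muscle voltage, finger position) onto a machine command.
Here is a minimal sketch of that idea in Python; the function name, signal
units, and gain constant are all invented for illustration, not taken from any
real device:

```python
# Hypothetical sketch: turning eye-muscle (EOG-style) voltages into
# cursor motion. The gain constant is an assumed calibration value.

def eog_to_cursor(h_volts, v_volts, gain=200.0):
    """Map horizontal/vertical eye-muscle voltages (millivolts)
    to pixel offsets from the center of the screen."""
    dx = int(h_volts * gain)   # looking right -> positive voltage -> move right
    dy = int(v_volts * gain)   # looking up -> positive voltage -> move up
    return dx, dy

# Looking slightly right and slightly down:
print(eog_to_cursor(0.5, -0.25))
```

A real system would also need filtering and per-user calibration, but the
core translation really is this simple once the signal is understood,
which is exactly what we can't yet do for raw EEG.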

A good starting point for a web search on this topic would be the Media
Lab at MIT (they build robots and stuff, pretty nifty). I'm sure you can
get to it by searching for "MIT" and "Media Lab". If no luck, email me
and I'll try to track it down for you.

Cheers,

Matt Jones



More information about the Neur-sci mailing list