Computers Understanding Thought

John M Price ez001932 at rocky.ucdavis.edu
Sat Dec 3 18:06:18 EST 1994


I just can't ignore this.

I am familiar with Eric Sutter's work, and setup.  It does work.  The 
electrodes are placed so as to be close to area 17 (primary visual 
cortex).  As a17 wraps itself around the calcarine fissure, this is not 
trivial, but, what the heck, it is a volume-conducted response anyway.  
I did hear of one person, a physician, who had electrodes placed on his 
dura.  Obviously his consent was informed!

The patient looks at an array of flashing squares.  The more the better, 
actually!  The EP recorded from the square that is foveated WILL be the 
greatest in amplitude, and the computer then does a matchup to the task 
or letter associated with that square.  Pretty neat, clean, and effective.
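In code, that matchup step is nothing more than an argmax over the per-square EP amplitudes.  A minimal sketch (the amplitudes and labels below are invented for illustration, not from Sutter's system):

```python
def select_foveated_square(ep_amplitudes, square_labels):
    # The foveated square evokes the largest potential, so pick the
    # task/letter associated with the highest-amplitude square.
    best = max(range(len(ep_amplitudes)), key=lambda i: ep_amplitudes[i])
    return square_labels[best]

# Hypothetical EP amplitudes (microvolts) for four flashing squares:
amps = [1.2, 0.8, 3.5, 1.1]
print(select_foveated_square(amps, ["A", "B", "C", "D"]))  # prints C
```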


Clark Dorman (dorman at acs.bu.edu) wrote:

: In article <3b4qec$7r5 at lyra.csx.cam.ac.uk> 92tad at eng.cam.ac.uk (T.A. Donaldson) writes:

: >   I have heard about an experiment in which a computer was able to pick up
: >and understand the THOUGHTS Yes or No, and also move a cursor around the
: >screen by picking up thoughts.
: >
: >   I understand that this was done by using SQUIDS (superconducting quantum
: >interference devices), and have found a lot of articles on the technical side
: >of setting up SQUIDS (see IEEE Transactions on Applied Superconductivity
: >1993), which suggest that this is possible.
: >
: >   However, I have not found any article in which the meaning of the neural
: >data collected was successfully understood by computer.
: >
: >   Does anyone know of the research I am talking about and could tell me what
: >journal articles apply? Anyone got any related information?
: >
: >   Tom Donaldson 
: >

: In the Health/Science section of the Boston Globe, on Monday, August 16, 1993,
: there was an article about several researchers doing work in this area.  Not
: exactly a peer-reviewed journal. 

: The systems have been used to do several things, including such publicity
: stunts as steering a boat.  Most of the systems use scalp electrodes rather
: than SQUIDs.

: People mentioned in the article:  

: 	Andrew Junker, from Yellow Springs, Ohio, engineer. (boat steering)

: 	Dr. Jonathan Wolpaw, neurologist, and Dennis McFarland, psychologist,
: 	  at NY State Health department (cursor moving)

: 	Grant McMillan, director of brain-actuated research program at Wright
: 	  Patterson AFB, Dayton, Ohio (missile selection, radar mode, etc.)

: 	Eric Sutter, scientist at Smith-Kettlewell Eye Research (disabled
: 	  computer interface)
: 	
: 	Dr. Emanuel Donchin, psychology prof at U. of Illinois (letter typing)

: You might try searching on the people above, especially Donchin.  

: Personally, I am _extremely_ skeptical that what is happening is based upon
: brain patterns.  Because they are using scalp electrodes in most cases (where
: the equipment is even mentioned), the noise from muscles can easily swamp the
: brain waves.  If you look at scalp recording in other fields, they usually
: have to average over many operations to determine the underlying brain
: activity.  Most of these people are probably fooling themselves into thinking
: that they are getting brain waves, when what is really happening is that the
: subjects are learning to contract various muscles in the scalp.  Just as you
: can learn to wiggle your ears by simply practicing in front of a mirror, you
: can learn to move your other muscles. 
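[The averaging point is worth making concrete: with uncorrelated noise, averaging N trials shrinks the noise by roughly sqrt(N), which is why single-trial scalp recordings are so hard to trust.  A toy simulation, with made-up numbers rather than real EEG:]

```python
import random

random.seed(0)

# A small fixed "evoked response" buried in much larger noise:
signal = [0.0, 1.0, 2.0, 1.0, 0.0]
noise_sd = 10.0

def trial():
    # One noisy recording of the evoked response
    return [s + random.gauss(0, noise_sd) for s in signal]

def average(trials):
    # Point-by-point average across trials
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(signal))]

many = average([trial() for _ in range(1000)])
# Averaging 1000 trials cuts the noise by ~sqrt(1000), about 32x,
# so `many` tracks `signal` closely while any single trial does not.
print(many)
```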

: I think that the systems, even if based on muscle contractions, can be very
: useful for the disabled.  A combination system, based on visual tracking, with
: cues from scalp recording, could be more useful than the systems that are
: based purely on visual tracking now.  

: Even if the researchers _are_ getting brain measurements, don't expect any
: sort of decoding of the brain activity that will give any content.  Examine
: the PET, MRI, and CAT scanning literature.  The researchers in those fields
: are still working on determining where activity occurs, and in what order,
: when different tasks are being done.  

A LONG while back, this was done.  With about ten people, the machine 
could read words thought by these individuals.  It, too, is a simple 
matching task.  The pattern of brain wave activity was taught to the 
machine as representing a specific word for the individual.
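That kind of matching is just nearest-template classification: store one averaged pattern per word, then assign a new epoch to whichever template it is closest to.  A minimal sketch (the templates and epoch values are invented, and real systems would use far longer vectors):

```python
def classify_word(epoch, templates):
    # Return the word whose stored pattern is nearest (squared
    # Euclidean distance) to the recorded epoch.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda w: dist(epoch, templates[w]))

# Hypothetical per-word brain-wave templates learned for one subject:
templates = {"yes": [1.0, 2.0, 0.5], "no": [-1.0, 0.2, 1.5]}
print(classify_word([0.9, 1.8, 0.6], templates))  # prints yes
```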

This is, essentially, as simple as speech recognition.  Take yourself 
back to, say, 1940, and see how many people would have believed that a 
machine could hear a word, and act appropriately.  


: Clark Dorman
: Cognitive and Neural Systems
: Boston University

--
____________________________________________________________________________
John M. Price, Ph.D.     | Physiological Emphasis - Likes machines too!
Psychology Department    |            PGP Key by request.  
University of California |            Privacy IS freedom.
Davis, CA  95616         | Ask the Chancellor to speak for this place!
