Multivalued and Continuous Perceptrons (Preprint)

George M. Georgiou georgiou at silicon.csci.csusb.edu
Sun Feb 21 21:26:18 EST 1993


Rosenblatt's Perceptron Convergence Theorem guarantees that a linearly
separable function (R^n --> {0,1}) can be learned in finite time.
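For a concrete reference point, here is a minimal sketch of the classical
perceptron learning rule the theorem refers to. The NumPy implementation
and variable names are my own illustration, not taken from the paper:

    import numpy as np

    def train_perceptron(X, y, max_epochs=100):
        """Classical Rosenblatt perceptron: learns a linearly
        separable function R^n -> {0,1}.  The Perceptron Convergence
        Theorem guarantees finitely many updates when the data are
        linearly separable."""
        w = np.zeros(X.shape[1])   # weight vector
        b = 0.0                    # bias (threshold)
        for _ in range(max_epochs):
            errors = 0
            for x, target in zip(X, y):
                out = 1 if np.dot(w, x) + b > 0 else 0
                if out != target:
                    # error-correction update: shift the decision
                    # boundary toward the misclassified example
                    w += (target - out) * x
                    b += (target - out)
                    errors += 1
            if errors == 0:        # all examples classified correctly
                break
        return w, b

    # Example: logical AND, a linearly separable function
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 0, 0, 1])
    w, b = train_perceptron(X, y)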

Question: Is it possible to guarantee learning, in finite time, of a
          continuous-valued function (R^n --> (0,1)) that can be
          represented on a perceptron?

The paper below answers this question (and others) in the
affirmative:

	      The Multivalued and Continuous Perceptrons
				  by
			  George M. Georgiou

  Rosenblatt's perceptron is extended to (1) a multivalued
  perceptron and (2) a continuous-valued perceptron.  It is shown
  that any function that can be represented by the multivalued
  perceptron can be learned in a finite number of steps, and any
  function that can be represented by the continuous perceptron can
  be learned with arbitrary accuracy in a finite number of steps.
  The whole apparatus is defined in the complex domain.  With these
  perceptrons, learnability is extended to more complicated functions
  than the usual linearly separable ones.  The complex domain
  promises to be a fertile ground for neural networks research.
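The abstract does not spell out the construction, but one plausible
picture of a perceptron "defined in the complex domain" with a
multivalued output, offered purely as an illustration and not as the
paper's actual definition, is a unit whose output is determined by
which angular sector of the complex plane the weighted sum falls in:

    import numpy as np

    def multivalued_output(w, x, k):
        """Illustrative k-valued activation in the complex domain
        (an assumption, not necessarily the paper's definition):
        the phase of the complex weighted sum selects one of k
        sectors, giving one of k discrete output values."""
        s = np.dot(w, x)                      # complex weighted sum
        phase = np.angle(s) % (2 * np.pi)     # phase in [0, 2*pi)
        return int(phase // (2 * np.pi / k))  # sector in {0,...,k-1}

    # Example: a 4-valued unit with complex weights and inputs
    w = np.array([1 + 1j, 0.5 - 2j])
    x = np.array([0.3 + 0.1j, -1.0 + 0.5j])
    print(multivalued_output(w, x, k=4))

Under this reading, k = 2 with real weights recovers the ordinary
threshold perceptron, and letting k grow suggests how the continuous
(0,1)-valued case could arise as a limit.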


A PostScript version of the paper (compressed, uuencoded, ~75k) can be
obtained by e-mailing me.

Comments and questions on the proofs are welcome.

--George
----------------------------------------------------------------------
Dr. George M. Georgiou                    E-mail: georgiou at wiley.csusb.edu
Computer Science Department                  TEL: (909) 880-5332
California State University	             FAX: (909) 880-7004
5500 University Pkwy
San Bernardino, CA 92407, USA


