In article <MPG.ed4ed5b4d7b67a398968a at news3.wt.net>, solntsev at wt.net
says...
> I want to make a couple of points.
>
> First, on the subject of neural nets: one should not try to extend such a
> computing device into a model of the brain simply because both use similar
> terminology, yet that is exactly what some researchers (and others) appear
> to be doing. A neural net is an interesting computing device that is both
> intrinsically parallel and
> distributed. Its operation is determined by multiple interconnected nodes,
> not unlike that of a brain. Yet it is not the only device that exhibits
> such properties. Actually, any system of multiple cooperating
> interconnected nodes could look like a brain. We should avoid drawing
> parallels between computing devices and the brain unless we can reasonably
> show that such devices explain more than one aspect of the brain's operation.
>
> Second, on the subject of linearity: most of us have become trapped in the
> Von Neumann world of sequential computation with only neural nets seen as
> the way out. Well, thank goodness, there are additional inherently parallel
> and distributed computing devices that could save the day. While there are
> several such devices, I will name Dataflow computers specifically, for I
> find them the most useful. Actually, a neural net could be considered a
> special case of a Dataflow computer. What is a Dataflow computer? It is a
> device composed of multiple cooperating interconnected nodes that represent
> a dataflow graph of the computation they perform. Each node can have
> multiple inputs and outputs connecting it to other nodes. Each node
> computes a function of those inputs, and that function is
> executed when these inputs become available. All nodes exchange information
> using messages that could represent complex data structures.
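To make the firing rule above concrete, here is a rough Python sketch (my
own illustration, not something from Alex's post) of a dataflow node that
fires only once all of its input slots have been filled. The direct
function calls stand in for what a real dataflow machine would do with
asynchronous message or token passing:

    import operator

    class Node:
        """A dataflow node: fires its function once every input slot is filled."""
        def __init__(self, func, n_inputs):
            self.func = func
            self.slots = [None] * n_inputs
            self.filled = [False] * n_inputs
            self.outputs = []                    # (target node, target slot) pairs

        def connect(self, target, slot):
            self.outputs.append((target, slot))

        def receive(self, slot, value):
            self.slots[slot] = value
            self.filled[slot] = True
            if all(self.filled):                 # all inputs available -> fire
                result = self.func(*self.slots)
                self.filled = [False] * len(self.filled)
                for target, tslot in self.outputs:
                    target.receive(tslot, result)

    # Dataflow graph for (a + b) * c, with a print node as the sink.
    add = Node(operator.add, 2)
    mul = Node(operator.mul, 2)
    sink = Node(print, 1)
    add.connect(mul, 0)
    mul.connect(sink, 0)

    add.receive(0, 2)   # a = 2
    add.receive(1, 3)   # b = 3 -> add fires, sends 5 to mul
    mul.receive(1, 4)   # c = 4 -> mul fires, prints 20

In the same spirit, a neural-net unit can be written as such a node whose
function is a weighted sum and threshold of its inputs, which is one way to
read the claim that a neural net is a special case of a Dataflow computer.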
>
> Just a couple of points (for now, as this message is getting too long :-))
>
> Alex
> All disclaimers apply
>