Re

Liar42 liar42 at aol.com
Tue Aug 28 05:54:41 EST 2001


> (...)
 
Before developing a neural network model of mismatch reduction a choice must be
made between feedforward mismatch reduction (fmmr) and recurrent mismatch
reduction (neural servoing).<
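For reference, the two alternatives named in the quote might be sketched roughly like this (Python; the linear toy dynamics, the delta-rule update and all names are illustrative assumptions of mine, not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
sensory = rng.normal(size=4)   # a toy 4-dimensional sensory signal

# Feedforward mismatch reduction (fmmr): the network receives the raw
# sensory input and is trained so its output matches that same input.
W = np.zeros((4, 4))
for _ in range(200):
    error = sensory - W @ sensory          # output should match the input
    W += 0.1 * np.outer(error, sensory)    # delta-rule update (illustrative)

# Recurrent mismatch reduction ("neural servoing"): the network's input
# is the mismatch between the sensory signal and its own output, so a
# well-adapted network drives its own input toward zero.
output = np.zeros(4)
for _ in range(50):
    mismatch = sensory - output            # the mismatch is the input
    output += 0.5 * mismatch               # each servo step reduces it
```

In this toy version the fmmr network is trained on the full signal, while the servo network only ever sees what it has not yet accounted for.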

Another person might not find generalized networks so relevant, but rather opinions of their own I: whether someone is much into optical hallucinations or usually not, whether a lot of hallucinogenic drugs were used lately, various aspects of the completeness and health of internal areas, and so on.

> With fmmr the network receives the actual sensory input and is the output
matched with the same sensory input.<

What makes you so sure that an area has to have the same output as input?

I might get an input and simply not find it relevant; so if my magic systems make a decision that has nothing to do with it, I would assume my hardware is also busy processing what I do find relevant.

Also, with various sensory input concerning stuff to do with outer data, I would not assume that all systems simply produce a matched output.

So, there are a bunch of leaves on a tree.

So what? They might be utterly irrelevant while I talk to someone, and maybe I do not even pay attention to whether there is a tree at all. I assume the sequencer has no particular interest in its leaves, or in a bunch of other stuff there, either; it might just navigate so that we don't bang into the tree in case it is in the way, but if it is off to the side, so what.

A lot of the sensory data input to various systems is, I assume, not that relevant for their output.

> With neural servoing<

Meaning what?

> the network <

As usual undiscerned.

>receives the mismatch between the sensory input and the network's output.<

Why should all networks receive that? It is utterly irrelevant data for a lot of purposes.

It sounds to me more like you are off into illusions again.

If I am talking to someone, whatever trees there are off to the side, with their leaf differences and so on, are utterly irrelevant data for our conversation, unless we are discussing exactly that.

A lot of output is not to do with a bunch of input.

There is no little computer sitting there that compares my input and my output, and that, any time I am not interested in what leaves are on a tree (though I noticed there are some) and make other output, sends a big system alert to all hardware networks, as if it were relevant for every motor system, every emotional system, the language structurer and a bunch of other internal stuff that there are leaves on a tree.


If there actually is a hallucination in the wake of some hallucinogenic drug, there might be information along with it that this data is not proper, and own I might then decide, depending also on knowledge and skills there, either to counter-steer out of hallucination settings or to go deeper in and make it more interesting.

Such can very much be an own I decision, and not one of own I's networks, nor of all the other systems or just their networks.


>(...) better predictions or synergetic behavior reduce the network's input,
since the mismatch is the input. <
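Numerically, the quoted claim amounts to the network's input norm shrinking as the prediction improves. A minimal sketch (Python; the update rule and the learning rate are my own assumptions, not the article's):

```python
import numpy as np

sensory = np.array([1.0, 2.0, 3.0])
prediction = np.zeros_like(sensory)

input_norms = []
for step in range(20):
    mismatch = sensory - prediction   # the mismatch IS the network's input
    input_norms.append(np.linalg.norm(mismatch))
    prediction += 0.3 * mismatch      # a better prediction for the next step

# Each improvement in the prediction shrinks the network's input toward zero.
```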

If a prediction is correct, there should be no mismatch in the input. Apart from that, a lot of people might not waste that much time on predictions.

Why should I waste much time trying to predict what you will write next, if I can just as well read it?

The prediction would be a waste of efficiency for me.


>However, if a network is successful, this relieves the neurons close to the
input of the continuous processing of known phenomena.<

What network is successful in what exactly?

Network vagueness crap aside, and ignoring a bunch of cells and the magic systems in the brain, including own I: if my own I made a prediction, say that in a city there will be houses, or that someone will be home when I go there to visit, that as such does not relieve the optical systems of the data to do with houses or that person.

> All input requires adjustment of the network since input signals novelty. For
this reason the neural servo network has been numerically explored. <

How about, for a change, you talk about what your servo is supposed to be?

Apart from that, it is known that there can be different cell and axon numbers in various areas in various individuals, and that there is not THE number for that; and this has certainly not been "explored" in 6 billion humans in order even just to compute an average that would be the correct average for all of them.

The more I read your stuff, the more it sounds like a waste of time.

> Besides the conventional dimension
 'weight'<
?

> of a network connection, <

(...)

> Those connections whose age dropped to zero were erased. <

You are plain nuts.

(If a connection had no age, it would not be there and therefore could also not be erased.)
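For what the quote does describe, a connection carrying an 'age' besides its weight, erased once the age reaches zero, might be sketched like this (Python; the rule that disuse decrements the age is my assumption, since the article is not quoted in full):

```python
from dataclasses import dataclass

@dataclass
class Connection:
    weight: float
    age: int   # extra dimension from the quoted scheme (assumed semantics:
               # the age counter is decremented when the connection goes unused)

connections = [Connection(0.5, 3), Connection(-0.2, 1), Connection(0.8, 2)]

def tick(conns, used):
    """Decrement the age of unused connections; erase those whose age hits zero."""
    survivors = []
    for i, c in enumerate(conns):
        if i not in used:
            c.age -= 1
        if c.age > 0:
            survivors.append(c)
    return survivors

# Only connection 0 is used this step; connection 1's age drops to 0, so it is erased.
connections = tick(connections, used={0})
```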

This article is too stupid for me to read on.

The next text about glia functions in the brain might be more interesting.





More information about the Neur-sci mailing list