Catastrophic forgetting is what happens when you train a neural net on one
task and then train it on a new task: it becomes completely unable to do the
old task.
This can be overcome by intermittently retraining the net on the old task
between trials of the new task.
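To make the schedule concrete, here's a minimal sketch of that interleaved 'reminding' loop. Everything in it is my own toy setup (a single logistic unit on two made-up 2-D tasks, plain SGD), far too simple to actually exhibit forgetting in a real net; it just shows the training schedule: train on task A alone, then alternate new-task steps with rehearsal steps on the old task.

```python
import math
import random

random.seed(0)

def make_task(cneg, cpos, n=100, noise=0.3):
    # Toy binary task: two Gaussian blobs, labelled 0 and 1 (synthetic data).
    X, y = [], []
    for _ in range(n):
        X.append((random.gauss(cneg[0], noise), random.gauss(cneg[1], noise))); y.append(0)
        X.append((random.gauss(cpos[0], noise), random.gauss(cpos[1], noise))); y.append(1)
    return X, y

def sgd_epoch(w, b, X, y, lr=0.5):
    # One full-batch gradient step for a single logistic unit.
    gw0 = gw1 = gb = 0.0
    for (x0, x1), t in zip(X, y):
        p = 1 / (1 + math.exp(-(w[0] * x0 + w[1] * x1 + b)))
        e = p - t
        gw0 += e * x0; gw1 += e * x1; gb += e
    n = len(y)
    return [w[0] - lr * gw0 / n, w[1] - lr * gw1 / n], b - lr * gb / n

def accuracy(w, b, X, y):
    return sum(((w[0] * x0 + w[1] * x1 + b) > 0) == (t == 1)
               for (x0, x1), t in zip(X, y)) / len(y)

XA, yA = make_task((-2, 0), (2, 0))   # task A: separable along the first input
XB, yB = make_task((0, -2), (0, 2))   # task B: separable along the second input

w, b = [0.0, 0.0], 0.0
for _ in range(100):                  # phase 1: train on the old task (A) alone
    w, b = sgd_epoch(w, b, XA, yA)

for _ in range(100):                  # phase 2: new task with intermittent reminders
    w, b = sgd_epoch(w, b, XB, yB)    # a step on the new task (B)...
    w, b = sgd_epoch(w, b, XA, yA)    # ...then a rehearsal step on the old task (A)

print(accuracy(w, b, XA, yA), accuracy(w, b, XB, yB))
```

After the interleaved phase the single set of weights handles both tasks, which is the point of the rehearsal schedule; demonstrating the forgetting itself (and the weight remodelling) would need a deeper net than this sketch.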
Interestingly (and this is what I was getting at in my OP), when you do this
intermittent 'reminding' of the old task, so that you end up with a net that
can do both tasks, the connection strengths are completely altered from those
of the original task. i.e. for a net to remember two things, the weights
aren't a superposition of the pattern for task 1 and the pattern for task 2;
they are completely remodelled.