TIL: catastrophic forgetting
The Computational Case for Hypocrisy is an interesting read.
neural networks can't be adjusted selectively, because each of their weights carries a little bit of the information. if you train a neural net on cats vs dogs and then train it again on cars vs trucks, the new gradients update the same weights that encoded the first task, so it largely forgets cats vs dogs.
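you can see this in a toy model. this is my own minimal sketch, not from the post: plain logistic regression in numpy on two deliberately conflicting synthetic tasks, standing in for "cats vs dogs" and "cars vs trucks". training on the second task overwrites the weights that solved the first.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(n, flip):
    # 2-D points; the label depends on the sign of the first coordinate.
    # flip=True inverts the labels, giving a directly conflicting task.
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] > 0).astype(float)
    return X, (1 - y) if flip else y

def sgd_logreg(w, X, y, lr=0.5, epochs=200):
    # gradient descent on logistic loss; every weight gets updated on
    # every step, which is exactly why the old task gets overwritten
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(((X @ w > 0) == (y > 0.5)).mean())

Xa, ya = make_task(500, flip=False)  # task A: "cats vs dogs"
Xb, yb = make_task(500, flip=True)   # task B: "cars vs trucks"

w = np.zeros(2)
w = sgd_logreg(w, Xa, ya)
acc_before = accuracy(w, Xa, ya)   # high: the model has learned task A

w = sgd_logreg(w, Xb, yb)          # retrain with no task-A data at all
acc_after = accuracy(w, Xa, ya)    # low: task A has been overwritten

print(acc_before, acc_after)
```

the tasks here conflict maximally to make the effect obvious; with real, partly overlapping tasks the forgetting is partial rather than total, but the mechanism is the same.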
the post argues the same thing probably happens in the brain. we have an old brain, old relative to exponential economic growth, and the ancestral limbic system is hard to retrain selectively, so the workaround is to add extra layers on top that can monitor what the limbic system is trying to do.
maybe this is what mindfulness meditation is after all: training the neocortex to notice what the limbic system is trying to do.