We often associate any sign of forgetfulness with some sort of disorder or disease. But everyday forgetfulness is normal, and some neuroscientists argue it is even helpful: our brains, they say, deliberately discard outdated information to build memories that are relevant and useful for survival.
Most of us think “perfect” memory means never forgetting, but maybe forgetting actually helps us navigate a world that is random and ever-changing.
So say two neuroscientists in a review published today in the journal Neuron. The argument is that memory isn’t supposed to act like a video recorder, but instead like a list of useful rules that help us make better decisions, says study co-author Blake Richards, a University of Toronto professor who studies the theoretical links between artificial intelligence and neuroscience. So it makes sense that our brains would make us forget outdated, irrelevant information that might confuse us or lead us astray.
We have yet to find the limits of what the human brain can store, and there’s more than enough room, so to speak, for us to remember everything. Still, the brain actually spends energy making us forget, by generating new neurons that “overwrite” the old ones, or by weakening the connections between neurons. But why does it do so if our brains aren’t running out of space?
First, forgetting old information can make us more efficient. Just think about all the times you’ve memorized the wrong name, and then later wished that you could remove that memory and stop confusing it with the right name.
Forgetting old information can also keep us from generalizing too much from one piece of information. Here, there are many parallels with artificial intelligence and how these systems learn, according to Richards. If you teach a computer to recognize faces by making it memorize thousands of them, all it will do is learn the particulars of all the specific faces. Then, when you expose it to a new face, the model won’t actually know it’s a face because it never learned the general rules. Instead of learning that faces are usually oval and have two eyes, a nose, and a mouth, it learned that some of these pictures have blue eyes and some of them have brown eyes and some have thicker lips and so on.
Human brains could run into this problem, too. Richards compared this to “Funes the Memorious,” a story by Jorge Luis Borges in which a man is cursed with perfect memory. Funes remembers in exquisite detail, but “doesn’t understand because everything he experiences is its own individual snapshot moment.” To fix this problem, AI researchers use a technique called “regularization,” where they force the system to forget some of the details until they’re left with the core information they’re interested in: what is a face, what is a dog versus a cat, and so on.
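The memorize-versus-generalize trade-off Richards describes can be sketched with a toy model. The snippet below uses L2 regularization (a penalty on large weights, one common form of the technique), not anything from the Neuron review itself; the data, function names, and parameter values are all illustrative assumptions. An unpenalized high-degree polynomial "memorizes" every noisy training point, while the penalized fit is forced to "forget" fine detail and keeps only the underlying pattern, so it does better on data it has never seen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: a simple underlying pattern (a sine wave) plus noise.
x_train = np.linspace(-1, 1, 10)
y_train = np.sin(np.pi * x_train) + rng.normal(0, 0.2, size=10)

def fit_poly(x, y, degree, lam):
    """Least-squares polynomial fit with an L2 penalty `lam` on the weights."""
    X = np.vander(x, degree + 1)
    # Closed-form ridge regression: w = (X^T X + lam*I)^-1 X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(degree + 1), X.T @ y)

def mse(w, x, y):
    """Mean squared error of the polynomial with coefficients w on (x, y)."""
    return float(np.mean((np.vander(x, len(w)) @ w - y) ** 2))

# Ten coefficients for ten points: with no penalty the model can pass
# through every noisy training point exactly -- pure memorization.
w_memorize = fit_poly(x_train, y_train, degree=9, lam=0.0)
# The penalty shrinks the weights, smoothing out the noise.
w_regular = fit_poly(x_train, y_train, degree=9, lam=1e-2)

# Held-out points drawn from the same underlying pattern, without noise.
x_test = np.linspace(-1, 1, 50)
y_test = np.sin(np.pi * x_test)
```

On the training set the memorizing model wins (near-zero error), but on the held-out points the regularized model generalizes better, which is the whole argument: discarding detail is what lets the system extract the rule.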
Deciding which information to forget, and how much of it, can be a matter of trial and error, both for humans and for computers. Our brains tend to forget memories of things that happened (episodic memories) more quickly than general knowledge (semantic memories). In fact, episodic memories tend to fade fairly quickly anyway — knowing which shirt you wore six weeks ago is rarely helpful. Many different factors go into this: how novel the situation is, how much attention someone is paying, how much adrenaline is in the system. “The brain’s principle is to forget everything except those instances that were highly salient,” says Richards. Traumatic events like assault, for example, stick with us because the brain wants us to remember — and avoid — situations that threaten our survival.
Ultimately, says Richards, we often assume that memory is a good thing, but “at the end of the day, our brains only do things if it was good for our survival from an evolutionary perspective.” And in the case of memory, he adds, evolution has probably shaped our brains to remember only the stuff that is pertinent to our survival. So maybe not being able to remember how you know someone is a feature of our brains, not a bug.
A version of this article appears on The Verge. | Author: Angela Chan (@chengela)