I recently spent an evening at an event with the legendary comedian Jim Carrey. Carrey has starred in dozens of movies, but the one that always sticks with me is Liar Liar, where he played a man forced to always tell the truth. It was the perfect role, as Carrey’s antics were backstopped by the paradox of trying to determine whether it is a good thing to always be truthful. I loved every bit of the movie, but it also left me with a dull headache and a sense of mental exhaustion.
Ironically, the feeling is not much different from spending time with a liar. Whether it’s a pathological liar in your family, a crazy person peddling conspiracy theories on the subway, or a friend of a friend who shares untrue posts on social media, the result is the same: a dull headache and a sense of mental exhaustion.
Separating truth from lies is mentally taxing, and the way our brains do it is different than you might expect. 27 years ago, Harvard psychologist Daniel Gilbert wrote an article in American Psychologist showing that we treat information the same way a court treats a suspect: innocent until proven guilty. This means that the brain starts by assuming information is true and then seeks to confirm or deny that truth.
Interestingly, we didn’t always know how the brain handled truth and lies. The seventeenth-century French philosopher René Descartes assumed the process was more neutral. He guessed that after hearing a statement, you take a second to understand the meaning, and then either accept it as true or reject it as false. But then the Dutch philosopher Baruch Spinoza came along and suggested an alternate method, which Gilbert found evidence to support centuries later.
It may seem like an insignificant difference, something for philosophers to argue over. But the implications are massive, especially in the era of #fakenews, aggressive marketing, and general untruthfulness.
If Descartes was right, we wouldn’t have a bias toward believing things and the world might not be such a trusting place. But what Gilbert demonstrated is that if the brain is overloaded, it will accept lies as truth. The reason is that when the brain becomes taxed, it essentially shuts down. So if we start by assuming something is true and the brain then becomes overloaded, there is little hope of changing course. As Gilbert explains it, “when resource-depleted persons are exposed to… propositions they would normally disbelieve, their ability to reject those propositions is markedly reduced.”
Resource depletion could be as simple as having your attention divided between two tasks. Or it could come from stress, lack of sleep, or too much information. What’s more, there’s evidence to suggest that merely hearing too many lies can both overload the brain and reinforce a lie. Just as a suspect can be mentally overloaded into a confession, the brain can become overloaded with information and simply give up on trying to separate truth from falsehood. With time and stress, the truth becomes blurry.
Compiled by Olalekan Adeleye
USA Today