When we read and hear false "facts," disinformation, conspiracy theories, and outright lies every day, distinguishing truth from fiction can be difficult. Why are we so susceptible to disinformation, and so quick to believe it? Perhaps we should blame our brains.
A Northwestern University study found that people tend to quickly file inaccurate statements away in memory because doing so is easier than critically evaluating and analyzing what they hear. According to lead author David Rapp, critically evaluating everything would be a nightmare, and we often assume that sources are trustworthy. It is not that people are lazy, though that certainly contributes to the problem; weighing every claim is a demanding computational task, and the brain tries to save those resources for when they are really needed. Information that is repeated over and over, accurate or not, builds memories, and "if we can get something easily, we tend to think it's more true," Rapp says.
No one can completely tune out falsehood, partly because of how our thinking is built and how misinformation exploits it. We rely on mental "shortcuts" to make many decisions, and most of what we encounter in daily life is obviously true: gravity makes things fall, leaves grow on trees, I live in a house.
For this reason, says Stephan Lewandowsky, a cognitive psychologist at the University of Bristol, people by default tend to believe whatever they see or hear. A claim can be false in countless different ways but true in only one: you might hear from 10 people that the sky is any number of colors, but 500 people will tell you it is blue. So if you hear something repeated over and over, there is a good chance it is true.
But these shortcuts don't always serve us well in today's political environment and on social media. The more often something is repeated, the more likely we are to believe it, because it feels familiar and fluent. We are also more likely to believe misinformation that fits our worldview and self-identity, a process often referred to as "confirmation bias": our tendency to seek out and prefer information that matches what we already believe.
Misinformation also seems to resist correction. Multiple studies have shown that even after we receive a correction and accept it as true, the original falsehood can still influence our thinking. Hearing the truth does not erase the falsehood from our memory.
In a meta-analysis aggregating the results of 32 studies involving more than 6,500 people, Nathan Walter, a professor of communication studies at Northwestern, found that correcting falsehoods reduces, but does not eliminate, the impact of misinformation. Brain-imaging studies provide evidence that a falsehood and its correction coexist in the brain and compete to be remembered.
In an article for Psychology Today, Dr. Joe Pierre detailed some of the characteristics of what he called the "illusory truth effect."
In other words, say something over and over and people will start believing it.
So how can we protect our brains from misinformation? Arizona State University researcher Madison Arnold suggests seven ways to protect yourself.
If you see a friend or family member sharing incorrect information, correct it, but be kind when you do and don't insult their intelligence. Also be careful not to repeat the falsehood more than necessary: emphasizing the wrong information makes it more likely that the person will remember the very thing that was wrong.