Science and Tech

Researchers investigate why people think they are right, even when they are wrong.

Oct. 10 –

A new study indicates that people naturally assume they have all the information they need to make a decision or support their position, even when they don't. In other words, they assume they are right, even when they are wrong. The researchers call this the "illusion of information adequacy," described in a paper published in the journal PLOS ONE by Angus Fletcher, professor of English at Ohio State University (United States) and member of the university's Narrative Project.

Fletcher completed the work with co-authors Hunter Gehlbach, an educational psychologist at the Johns Hopkins University School of Education, and Carly Robinson, a senior researcher at Stanford University School of Education, all of whom are also in the United States.

"We found that, in general, people don't stop to think about whether there might be more information that would help them make a more informed decision. If you give people a few facts that seem to fit together, most will say 'that sounds right' and accept it," Fletcher explains.

To reach this conclusion, the study enrolled 1,261 Americans who participated online. Participants were divided into three groups, each of which read an article about a fictional school that lacked adequate water. One group read an article giving only reasons why the school should merge with another school that had sufficient water; a second group's article gave only reasons to remain separate and wait for other solutions; and a third, control group read all the arguments, both for merging the schools and for staying separate.

The results showed that the two groups that read only half of the story (either just the arguments for or just the arguments against the merger) still believed they had enough information to make a good decision, Fletcher said. Most said they would follow the recommendations in the article they read. "Those who had only half the information were actually more certain about their decision to merge or stay separate than those who had the full story," Fletcher notes. "They were pretty sure their decision was the right one, even though they didn't have all the information." Additionally, participants who had half the information said they thought most other people would make the same decision they did.

Despite this, the study offers some good news. Some of the participants who had read only one side of the story later read the other side's arguments, and many of them were willing to change their minds once they had all the facts. However, Fletcher cautions, this does not always work, especially on deep-rooted ideological issues. In such cases, people may not trust the new information or may try to reinterpret it to fit their preexisting opinions.

"But most interpersonal conflicts have nothing to do with ideology. They are simply misunderstandings that arise in the course of everyday life," says Fletcher. These findings complement research on so-called naïve realism, the belief that one's subjective understanding of a situation is the objective truth, Fletcher explains. Research on naïve realism often focuses on how people form different interpretations of the same situation. The illusion of information adequacy, by contrast, shows that people can arrive at a shared understanding if both have enough information.

Fletcher, who studies how people are influenced by the power of stories, says people should make sure they have the full story about a situation before taking a position or making a decision. "As we discovered in this study, people's default assumption is that they know all the relevant facts, even when they don't," he concludes.
