
“The Illusion Of Information Adequacy”: Why People Think They’re Right Without All The Facts

How we respond to arguments or opinions that differ from our own can have serious implications, whether it’s a tiff with a friend or family member, or a stubborn dispute with a colleague at work. We’ve probably all experienced situations where everyone involved thinks they’re right and the conversation goes nowhere. According to a new psychology study, this can be caused by a newly coined bias known as “the illusion of information adequacy”, where people tend to assume they have all the information they need to take a position or make an argument, even when they don’t.

When it comes to understanding the reality of things, humans are pretty rubbish. We have various biases that blur the line between how we think things are and how they really are.


For instance, psychologists know that people have a default belief that their own personal, subjective views represent an objective understanding of reality and that these views make up the consensus. This phenomenon, known as naïve realism, can cause all sorts of issues when we try to navigate differing perspectives. If people disagree with us on a topic – such as abortion rights, the Israeli-Palestinian relationship, or climate change – it becomes easy to dismiss them as either relying on the wrong information, being unwilling or unable to think logically, or just giving in to their own biases.

In short, our brains make it difficult for us to accept other people’s views as correct, but naïve realism is not the only challenge to overcome. Researchers from Ohio State University, Stanford University, and Johns Hopkins University have now posited an additional bias, which they call the “illusion of information adequacy”. This bias leads people to assume they have enough information to understand a situation and make the right decisions, even though they often have no way of knowing what they don’t know.

“From Socrates to Rumsfeld, people often acknowledge that there is much that they do not know, including a meta-awareness of ‘unknown unknowns’,” the researchers explain in their study. “We argue that another default setting – comparable to naïve realists’ assumptions that they see objective reality – is that people fail to account for the unknown unknowns.”

This failure results in people navigating their social worlds with confidence, assuming they have all the information they need, forming opinions and reinforcing values and behaviors without questioning how much they don’t know.


“For example,” the team explain, “many drivers have pulled up behind a first car at a stop sign only to get annoyed when that car fails to proceed when traffic lulls at the intersection. Drivers of these second cars may assume they possess ample information to justify honking. Yet, as soon as a mother pushing her stroller across the intersection emerges from beyond their field of vision, it becomes clear that they lacked crucial information which the first driver possessed.”

In this case, the second driver acts on the assumption that they have sufficient knowledge to justify honking their horn at the other car, but they are wrong.

This may seem like a trivial example, but it epitomizes a phenomenon that can have implications for more serious situations, such as political debates or personal relationships.

To demonstrate this specific bias, and how it differs from naïve realism, the team surveyed 1,261 Americans through the online platform Prolific. The participants read an article about a water shortage at a fictional school. One group read an article offering arguments for why the school should merge with another school, while a second group read an article that only discussed why the school should stay separate and hope for a solution to the issue. A control group read all of the arguments, both for merging and for staying separate.


The team found that most people in groups one and two – pro-merging and pro-separate – believed they had sufficient information to make a decision about the school’s future. In contrast, only around 55 percent of the control group, which had read both sides, believed the school should merge. Those who had half the information were also more confident that other people would make the same recommendations as them.

“[T]his study provides convergent evidence that people presume that they possess adequate information – even when they lack half the relevant information or [may] be missing an important point of view. Furthermore, they assume a moderately high level of competence to make a fair, careful evaluation of the information in reaching their decisions,” the team explained.

Interestingly, the research also showed that some participants were willing to change their recommendations once they were aware of the other half of the argument. Once these people were given the rest of the arguments, the results were comparable to those of the control group, with around 55 percent favoring merging and 45 percent favoring staying separate.

“Contrary to our expectations, although most of the treatment participants who ultimately read the second article and received the full array of information did stick to their original recommendation, the overall final recommendations from those groups became indistinguishable from the control group.”


The results suggest that sharing a pool of information may lead to greater agreement. They also show that the illusion of information adequacy can be overcome with a certain level of self-awareness.

“Although people may not know what they do not know, perhaps there is wisdom in assuming that some relevant information is missing”, the team conclude. “In a world of prodigious polarization and dubious information, this humility – and corresponding curiosity about what information is lacking – may help us better take the perspective of others before we pass judgment on them.”

The paper is published in PLOS ONE.

