Scientists, and many others, would love it if people would “trust the science” when faced with important public or personal decisions. However, only a tiny proportion of the population will have a deep knowledge of research on any topic, so most people most of the time have to rely on what scientists say about it. We can talk all we like about what people should do when scientists disagree, as they frequently do, but maybe looking at how people actually decide which scientists to trust would be helpful as well.
This is what Dr Branden Johnson of Decision Research and colleagues are doing. Their results start the process of revealing what might work when scientists and science communicators want the reality of a scientific debate to cut through the misinformation.
Johnson and colleagues took three live scientific debates and presented statements about them to 3,150 Americans to test what influenced which side participants favored. Two of the debates – sea level rise from climate change and the effects of cannabis – are hot-button political concerns tied to the culture wars. The third, the composition of dark matter, is relatively free of such conflicts, although Creationists occasionally try to enlist it.
“Our goal here is to explore which cues people use to determine which side in a dispute among many scientists to believe,” the study’s authors write. People may come to some debates – the effects of cannabis being a particularly clear example – with a preexisting lean, but they’ll also be influenced by the way the scientists present their case and what they’re told about those scientists.
Participants in the study were presented with a few sentences of background about what scientists agree and disagree on in relation to the debate and were asked a series of questions. Some of these concerned how much they knew about the topic and cared about who was right, while others related to their views on science more broadly.
They were then shown a series of cues that might influence their position, such as: “About 75% of the scientists with expertise on this topic support Position A”; “The information supporting position B has on average been collected more recently with newer techniques”; or “Scientists who support Position B average 7.5 more years of experience in doing this kind of research.”
After being shown several of these cues, participants were asked which side’s position they thought was right, and were asked the initial questions again. Further cues were then shown and the questions repeated, allowing the researchers to test the relative power of different cues.
Not surprisingly, cues indicating where most scientists were aligned, that one side’s evidence came from more advanced techniques, or that independent teams had reached the same conclusions were all quite persuasive. Having the more experienced scientists on a side also counted for quite a bit, even though some might see greater experience as a sign of being behind the times. These results matched those of a previous study by Johnson on the same topic with a different design.
On the other hand, participants were less swayed by learning a position was favored by scientists who graduated from more prestigious universities, or whose employer stood to gain from one position winning.
None of the cues had much effect on whether participants cared about who was right. Some did, however, increase mistrust of science, showing they need to be used with care.
Not surprisingly, cues were more influential in relation to dark matter – where participants were unlikely to have pre-existing positions – than on cannabis, with sea levels in between. More unexpectedly, factors like prior engagement with the topic and knowledge of science had no discernible impact on how influenced people were by the cues.
There’s a lot of anxiety about the sight of experts disagreeing in public and how that will affect those outside the field. “[T]he dispute may threaten belief in the value of science overall rather than just in the value of the specific science being disputed. A dispute may be even more threatening when among large groups of scientists on each side, rather than (say) one individual scientist versus another,” Johnson and colleagues write. They add, however, “Concealing disputes could be equally problematic.” Consequently, there has been considerable investigation of whether witnessing scientific disagreements makes people lose confidence in science.
The authors note that in contrast, the question of how people make up their minds between competing groups of scientists hasn’t been much studied: “More attention has been devoted to disputes between scientists on one side and nonscientists on the other,” they write.
The study is published in Risk Analysis.