
In April last year, the hugely popular podcast The Joe Rogan Experience hosted a high-profile debate between two guests. On one side was Flint Dibble, a professional archaeologist at Cardiff University in the UK; on the other was Graham Hancock, a British writer. The stakes: whether or not there is evidence of a long-lost, super-advanced global civilization that was mysteriously destroyed by rising sea levels around 20,000 years ago.
Anyone familiar with this idea will likely know that, to the archaeological community, it is nothing short of pseudoarchaeology bordering on a conspiracy theory. Over the years, however, Hancock has popularised it through books, documentaries, and appearances on shows like Joe Rogan’s, as well as his own Netflix series, Ancient Apocalypse. The idea has been rejected by the archaeological community for lacking evidence and for either distorting accepted interpretations or dismissing them as the product of bias or a cover-up.
Unfortunately, the idea is extremely popular online, part of a rising trend of anti-intellectualism across the internet. But Dibble’s debate shook things up. Over the course of four hours, the two discussed the subject, and the archaeologist was able to significantly challenge these ideas.
“The only way to kind of challenge this is directly,” Dibble told IFLScience. “You have to go into the space where people are listening to him and show them the real evidence and explain it clearly and in a charismatic way and an entertaining way with full citations up there and everything.”
This was a risky venture for Dibble, as Hancock and Rogan have an established rapport from previous appearances, so it could have gone badly. But he managed to present a compelling and thought-provoking position that undermined his opponent’s. In doing so, he showed that scientists can challenge misinformation, fake news, and conspiracy theories related to their work at a time when discredited ideas and misconceptions are proliferating online.
According to a recent UN survey, around 85 percent of people across the world are now worried about misinformation and its impact on their fellow citizens. Within this milieu, scientific ideas, stories, and claims are often attacked, dismissed, or appropriated and distorted to fit a specific aim: Flat Earthers deny aspects of physics, anti-vaccine proponents misrepresent scientific data, climate change deniers cherry-pick evidence, and conspiracy theorists claim archaeologists are covering up the “truth”. This is largely because science occupies a powerful position of authority in the modern world, making it a valuable tool for influencing beliefs or persuading people.
So given this issue, do scientists have a responsibility to fight back?
I spoke to three working scientists, including Dibble, to find out how they feel about the role scientists play in this issue, and whether they believe there is a responsibility to address false claims related to their fields.
The problem with social media and our beliefs
For Dr Jonathan Stea, a clinical psychologist and adjunct assistant professor at the University of Calgary, misinformation and the closely related issue of pseudoscience are part of the professional terrain. The realms of well-being and mental health have always been subject to myths, misconceptions, and alternative ideas that are not derived from evidence-based practices. But things really changed in 2020, during the COVID-19 pandemic.
“I really noticed, you know, once I got to social media, that there was a lot of anti-psychiatry tropes, [a] kind of a repeated idea or phrase, [as well as] just a lot of misinformation about mental health that were really eye-opening,” Stea told IFLScience. “I think that got worse during the pandemic, and certainly right up until now.”
There’s a Canadian code of ethics for psychologists, and one of the principles is a responsibility to society… part of that ethical code, in my opinion, also involves… debunking mental health misinformation and pseudoscience. So, I believe it’s written into the fabric of our codes of ethics.
Dr Jonathan Stea
One contributing problem, Stea argued, is the role of major health influencers on social media, as well as prominent figureheads who proliferate misinformation. For instance, in 2023, Elon Musk tweeted to hundreds of millions of people that depression is “overdiagnosed” and that SSRIs, the most common medication used to treat depression, are “zombifying” the public. Instead, people should take “ketamine”, which “is a better option”, he said.
Musk’s remarks came in response to a community note on X that pointed out that depression is a mental disorder “resulting from a complex interaction of social, psychological and biological factors” after Andrew Tate said depression wasn’t real, it’s “a choice”.
“And so these guys, I wouldn’t necessarily call them part of the anti-psychiatry movement per se, but what I think they’re doing is parroting these tropes that are so pervasive and imbued in our culture that they’re not even aware that these tropes are happening,” Stea explained.
“I think that’s a very dangerous thing, because when people are repeatedly hearing such tropes that work on our psychology, that kind of tap into our personal biases, and, you know, misinformation can spread in that way.”
This, for Stea, relates to what is known as the “illusory truth effect”, the tendency to believe something is true when you repeatedly hear about it. As he says, when it comes to our beliefs “familiarity is not so well differentiated from accuracy or the truth. So, we hear things repeatedly, our brains just tend to start to believe them.”
Once Stea became aware of the extent of this situation, he started to take a more active role against misinformation and debunking predatory pseudoscience in the wellness industry. However, while this may seem like a personal mission, Stea believes it is part of his broader code of ethics as a psychologist.
“There’s a Canadian code of ethics for psychologists, and one of the principles in there is responsibility to society,” he explained. “And what that involves is, in part, promoting and practicing evidence-based care and making sure that the public is protected in that manner. The converse of that, or part of that ethical code, in my opinion, also involves the converse, which is to take on debunking mental health misinformation and mental health pseudoscience. So, I believe it’s written into the fabric of our codes of ethics.”
However, this is not an easy thing to do in practice, and it may even require a kind of split between the clinical role and the wider professional duty. Those working within a clinical setting are expected to respect a client’s autonomy, so addressing their personal beliefs – unless they are clearly dangerous – does not necessarily warrant therapeutic intervention.
Addressing this is delicate, but as Stea says, sometimes respecting a patient’s autonomy requires a practitioner to give them “a choice on the treatments they can have, and if you’re not telling them the truth about the evidence behind [pseudoscientific options like Reiki], or energy healing, then that’s not respecting their autonomy.”
But what about scientists working in fields that do not typically have formal codes of ethics to dictate or guide their conduct? Or, similarly, disciplines that do not necessarily require scientists to engage with the wider public?
It’s not all about dinosaurs
“Today, there’s so much fake news and miscommunication throughout science [that] it’s kind of running riot,” palaeontologist Dr Dean Lomax told IFLScience. It’s reached a “point where […] I feel it’s vitally important that, as an actual scientist, we engage with the public as much as possible.”
Lomax, who specializes in excavating and researching dinosaurs and extinct marine reptiles and is a leading expert on ichthyosaurs, is no stranger to public engagement. He is a multi-award-winning researcher, author, and presenter who has appeared in many documentaries – notably as the co-host of Dinosaur Britain – and regularly speaks at events, including delivering two TED Talks. Through his work, he has encountered many distorted or miscommunicated ideas related to palaeontology that are often circulating online.
“Within the realms of palaeontology, in terms of miscommunication, misinformation and everything like that, [the focus] is often on the age of fossils and similar,” he explained.
This may not sound like much, perhaps just a case of people underestimating or muddling up just how many millions of years old something is. But this type of misunderstanding or misconception can feed into positions concerning the age of the Earth, or other creationist views that deny that fossils – or anything else – could be older than about 7,000 years.
These types of debates are well known and have been running for decades. But they are accompanied by more subtle distortions that can, over time, impact the value of scientific research more generally.
I think pretty much nowadays, in my opinion, every scientist really ought to be doing some sort of public outreach, some sort of engagement […] because if you’re not out there challenging what other people are saying about the work, that’s where a lot of misinformation can spread.
Dr Dean Lomax
This slipperier issue concerns the public’s fascination with specific topics. It’s what Lomax refers to as “the dinosaur effect”: the situation where palaeontological research only attracts public attention if it is somehow connected with dinosaurs. Sure, dinosaurs are great, and they hold a premium place in our cultural imagination, but they’re not the only things that lived millions of years ago.
This bias towards all things “terrible lizard” may lead some academics to emphasize features of their work that can be associated with dinosaurs in some way (e.g. stressing that a specimen is from the Jurassic period, “the time of dinosaurs”), or journalists to latch onto those features because they know they will appeal to wider audiences.
Unfortunately, this means the study’s original message is gradually distorted as the focus remains solely on what is popular. As it is shared across more social media accounts, each often tweaking it for their own audiences, the distortions become more pronounced.
“When research actually goes viral”, Lomax said, “sometimes it’s going viral for the wrong reasons and it’s not actually the science that you’re trying to talk about.”
Part of the issue here is also the growth in online “science influencers” who are muddying the water. This type of coverage of scientific ideas can be, as Lomax says, “just as bad as some of the other info out there.”
“Some of them gain big followings, but only through sharing others’ work or not doing the necessary background research. Rather, regurgitating things from Wikipedia, already poor journalism of a study, or using click-bait-type posts/videos, which create high engagement. As a result, the public is then led to believe that said person is a real scientist and that they are an authority, which is most definitely not the case.”
One example of palaeontology-related misinformation is the story, resurfacing every few years, that woolly mammoths are about to be cloned or “brought back”. It’s a popular story but, as Lomax says, it isn’t likely to actually happen. The recurring focus on this idea ends up distorting public understanding of both current genetic technologies and the role of palaeontology, creating the view that we are only a few years away from a Jurassic Park-like reality.
This is obviously not true or realistic, but it also sends the wrong message.
“I mean, in a nutshell, it’s always a case of: rather than wasting the amount of money trying to bring back all the mammoths or other Ice Age fauna, which will never really be true representatives, I’d rather that money was invested in animals that are genuinely at risk of extinction today”.
This is why Lomax believes all scientists should be trained to communicate with the public.
“I think pretty much nowadays, in my opinion, every scientist really ought to be doing some sort of public outreach, some sort of engagement […] because if you’re not physically getting it out there doing public outreach or challenging what other people are saying about the work [then] that’s where a lot of misinformation can spread.”
The challenges of confronting misinformation
Okay, so promoting your research to get a message across is one thing, but what about actually addressing misinformation, fake news, or conspiracy theories based on that research?
“I think it all has to be kind of within reason that […] where possible, [you] call out a few bigger profiles, comments or whatever, or posts that are shared on social media,” Lomax explained, but be aware that it can “then lead [down] a slippery slope where you’ll have people who will send a whole bunch of hideous messages, or leave comments and stuff.”
This is the challenge associated with this type of engagement. Addressing distorted reports of scientific work can lead to backlash, especially if the distortions are part of a bigger conspiracy theory or inform some aspect of someone’s wider worldview. And these attacks can be nasty.
“I don’t blame people for not wanting to engage in this stuff, because people have been pushed away from social media [because of it],” Stea said. “They’ve been scared off because who the hell wants to get 100 abusive messages a day. It can be really discombobulating for people.”
The playbook often relies on the idea that extraordinary claims require extraordinary evidence, ‘so show me the evidence’. But as soon as you do that, you’re going to start getting Gish galloped by the pseudoscientist. You’re letting them define the context.
Dr Flint Dibble
So, the situation is difficult, but does that mean the effort is worthless? No, it just means scientists need to be aware of the problems that can come with this activity and to have a plan. This is not new terrain by any means, and there is a substantial body of work on the subject published in academic journals, offering insights into the mechanisms behind misinformation sharing and ways to address it. As with anything, though, there are different techniques that can be used, some more effective than others.
“I think a lot of what I do is read misinformation research and the strategies that scholars have published on, and I think that’s very different from the strategy that people used until fairly recently,” Dibble explained.
According to Dibble, many people challenging misinformation or conspiracy theories rely on a “playbook” that is now a bit out of date: the “burden of proof” approach popularised by Carl Sagan in his 1995 book The Demon-Haunted World.
“That playbook oftentimes relies on the idea that extraordinary claims require extraordinary evidence, ‘so show me the evidence’. But as soon as you do that, you’re going to start getting Gish galloped by the pseudoscientist. You’re letting them define the context and you’re responding to them.”
Instead of “debunking”, Dibble says, researchers should prepare thoroughly and “pre-bunk”. This is effectively a case of getting in the first word. In his debate with Graham Hancock, Dibble insisted on going first and then used “misinformation strategies” from there on.
“My strategy […] was to be able to speak first and lay out what archaeology actually is and how much evidence we have, because that sets the context in their mind. And I think also starting with [images of] sex on Athenian pots, by making it ‘edutainment’, that’s also really important. By making it interesting so people don’t want to flip the channel or whatever, and you keep them in the loop, and not talking down to them. It stays interesting, understandable, and still complex.”
“I think the biggest key was maintaining my cool. I’ve seen a lot of comments from Graham’s fans, even ones that are still his fans and ones that are not his fans anymore, really critiquing him for not bringing any evidence, and for his attacks on the field and on me.”
But ultimately, Dibble argues that being respectful is the best method.
“I actually think the best advice I give to colleagues is when you engage with the public, and in particular in a high profile or maybe negative situation, to treat it like it’s a classroom, in a sense, treat everybody there as if they are your students, meaning you treat them respectfully. I think the lessons we get from pedagogy, how we teach, are very applicable to the public forum. Trying to think through, all right, what is my actual lesson plan and learning goals for this kind of situation, instead of flying at it, you know, off the cuff, because that’s not gonna be as effective.”
The road ahead
The advice from these three scientists comes at an important moment. As of 2025, major social media platforms like Meta and X have abandoned their fact-checking safeguards, and non-experts with known anti-science agendas – such as long-standing anti-vaxxer Robert F Kennedy Jr, now Secretary of the US Department of Health and Human Services – hold influential government policy positions whose effects are felt not just in the United States but globally. The US National Science Foundation recently terminated government research grants for studying misinformation, disinformation, and AI-generated deepfakes, stating that it would not support research “that could be used to infringe on the constitutionally protected speech rights of American citizens”, citing an executive order from President Trump. Combined with the already turbulent environment left over from the 2020 pandemic, these developments mean the next few years are likely to see even more problems.
However, there are those prepared to combat misinformation and to defend the need for evidence-based critical thinking. Scientists and other experts who want to take a stance against misinformation may feel isolated in doing so, but clearly there are already others out there doing exactly that.