Could Adding “Trust” And “Distrust” Buttons Be The Future Of Social Media?

June 7, 2023 by Deborah Bloomfield

It can be hard to know what to trust these days. With the rise of fake news and misinformation on social media, you could be forgiven for feeling unsure about content, especially when it is popular. But a new experimental study has shown that adding “trust” and “distrust” buttons alongside the existing “like” button could be an important step in combating inaccurate posts.

The issue here is about incentivizing accuracy while discouraging misinformation. Researchers from UCL have found that incentivizing accuracy can cut the reach of false information by half.


“Over the past few years, the spread of misinformation, or ‘fake news’, has skyrocketed, contributing to the polarisation of the political sphere and affecting people’s beliefs on anything from vaccine safety to climate change to tolerance of diversity,” Professor Tali Sharot said in a statement. “Existing ways to combat this, such as flagging inaccurate posts, have had limited impact.”

Part of the challenge is that users are often rewarded for sharing fake information by receiving “shares” and “likes”, whereas true content may be less popular.

“Here, we have designed a simple way to incentivise trustworthiness, which we found led to a large reduction in the amount of misinformation being shared.”

In a previous study, Professor Sharot and colleagues found that people are more likely to share information they have already seen, suggesting that the repetition of misinformation is taken as a sign of its accuracy. So, in this latest study, the team sought to test potential ways to combat this.


They examined a simulated social media platform used by 951 participants across six experiments. The platform operated like most regular social media platforms: it allowed users to share news articles, only half of which were accurate, to which other users could respond with the usual “like” or “dislike” reactions, and which they could also repost. In some versions of the experiment, users could additionally react with “trust” and “distrust” buttons.
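The paper does not publish the platform's code, but the setup described above can be illustrated with a minimal sketch. Everything here (the Post and Reaction names, the react and build_feed helpers) is hypothetical and only mirrors the mechanics reported in the article: a feed in which half of the posts are accurate, the standard like/dislike reactions, and trust/distrust reactions that are only available in some experimental conditions.

```python
from dataclasses import dataclass, field
from enum import Enum
import random

class Reaction(Enum):
    LIKE = "like"
    DISLIKE = "dislike"
    TRUST = "trust"        # only offered in the trust/distrust conditions
    DISTRUST = "distrust"

@dataclass
class Post:
    headline: str
    is_accurate: bool                          # half of the articles shown were accurate
    reactions: list = field(default_factory=list)
    reposts: int = 0

def react(post: Post, reaction: Reaction, trust_buttons_enabled: bool) -> None:
    """Record a reaction, allowing trust/distrust only when that condition is enabled."""
    if reaction in (Reaction.TRUST, Reaction.DISTRUST) and not trust_buttons_enabled:
        raise ValueError("Trust/distrust reactions are not available in this condition")
    post.reactions.append(reaction)

def build_feed(true_headlines: list, false_headlines: list) -> list:
    """Build a shuffled feed in which only half of the articles are accurate."""
    posts = [Post(h, True) for h in true_headlines] + [Post(h, False) for h in false_headlines]
    random.shuffle(posts)
    return posts
```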

The results showed that the incentive structure worked well, as people relied more heavily on the trust/distrust buttons than they did on regular like/dislike ones. Essentially, the inclusion of these new reaction buttons – what the authors referred to as social “carrots” and “sticks” – turned trustworthiness and validity into socially desirable actions. Additional analysis using computational modeling showed that the introduction of the trust/distrust buttons also led participants to be more discerning when it came to choosing what to repost.

Interestingly, the researchers also found that participants who used the version of the platform with the new trust/distrust buttons ended up with more accurate beliefs as well.

“Buttons indicating the trustworthiness of information could easily be incorporated into existing social media platforms, and our findings suggest they could be worthwhile to reduce the spread of misinformation without reducing user engagement,” co-lead author and PhD student Laura Globig added.


“While it’s difficult to predict how this would play out in the real world with a wider range of influences, given the grave risks of online misinformation, this could be a valuable addition to ongoing efforts to combat misinformation.”

The study is published in eLife.


