The FDA should regulate Instagram’s algorithm as a drug

September 15, 2021

Daniel Liss, Contributor


Daniel Liss is the founder and CEO of Dispo, the digital disposable camera social network.

The Wall Street Journal on Tuesday reported Silicon Valley’s worst-kept secret: Instagram harms teens’ mental health. Its impact is so negative that, for some users, it introduces suicidal thoughts.

Thirty-two percent of teen girls who feel bad about their bodies report that Instagram makes them feel worse. Of teens with suicidal thoughts, 13% of British and 6% of American users trace those thoughts to Instagram, the WSJ report said. This is Facebook’s internal data. The truth is surely worse.

President Theodore Roosevelt and Congress created the forerunner of the Food and Drug Administration in 1906 precisely because Big Food and Big Pharma had failed to protect the general welfare. As its executives parade at the Met Gala in celebration of the unattainable 0.01% of lifestyles and bodies that we mere mortals will never achieve, Instagram’s unwillingness to do what is right is a clarion call for regulation: The FDA must assert its codified right to regulate the algorithm powering the drug that is Instagram.

The FDA should consider algorithms a drug impacting our nation’s mental health. The Federal Food, Drug, and Cosmetic Act gives the FDA the right to regulate drugs, defining them in part as “articles (other than food) intended to affect the structure or any function of the body of man or other animals.” Instagram’s internal data shows its technology is an article that alters our brains. If this effort fails, Congress and President Joe Biden should create a mental health FDA.


The public needs to understand what Facebook and Instagram’s algorithms prioritize. Our government already knows how to evaluate, through clinical trials, products that can physically harm the public. Researchers can likewise study what Facebook privileges and the impact those decisions have on our minds. How do we know this? Because Facebook is already doing it; they’re just burying the results.

In November 2020, as Cecilia Kang and Sheera Frenkel report in “An Ugly Truth,” Facebook made an emergency change to its News Feed, putting more emphasis on “News Ecosystem Quality” scores (NEQs). High NEQ sources were trustworthy sources; low were untrustworthy. Facebook altered the algorithm to privilege high NEQ scores. As a result, for five days around the election, users saw a “nicer News Feed” with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed this change because it led to less engagement and could cause a conservative backlash. The public suffered for it.

Facebook likewise has studied what happens when the algorithm privileges content that is “good for the world” over content that is “bad for the world.” Lo and behold, engagement decreases. Facebook knows that its algorithm has a remarkable impact on the minds of the American public. How can the government let one man decide the standard based on his business imperatives, not the general welfare?
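Mechanically, both experiments amount to the same change: blending a quality signal into the feed’s ranking score instead of optimizing for engagement alone. A minimal sketch of that kind of reweighting, where the field names, scores and weights are illustrative assumptions rather than Facebook’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # model estimate of clicks, comments, shares
    quality_score: float         # hypothetical 0-1 signal: an NEQ-style source
                                 # trust score, or a "good for the world" label

def rank_feed(posts: list[Post], quality_weight: float = 0.0) -> list[Post]:
    """Order posts by a blend of predicted engagement and quality.

    quality_weight = 0.0 is a pure engagement ranking; raising it privileges
    high-quality sources, as in the "nicer News Feed" change described above.
    All names and numbers here are assumptions, not Facebook's code.
    """
    def score(p: Post) -> float:
        return ((1.0 - quality_weight) * p.predicted_engagement
                + quality_weight * p.quality_score)

    return sorted(posts, key=score, reverse=True)
```

On this model, reversing the change is simply turning quality_weight back toward zero: the trade-off between engagement and the public good collapses into a single parameter that one company currently sets alone.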

Upton Sinclair memorably uncovered dangerous abuses in “The Jungle,” which led to a public outcry. The free market failed. Consumers needed protection. The 1906 Pure Food and Drug Act for the first time promulgated safety standards, regulating consumable goods impacting our physical health. Today, we need to regulate the algorithms that impact our mental health. Teen depression has risen alarmingly since 2007. Likewise, suicide among those ages 10 to 24 rose nearly 60% between 2007 and 2018.

It is of course impossible to prove that social media is solely responsible for this increase, but it is absurd to argue it has not contributed. Filter bubbles distort our views and make them more extreme. Bullying online is easier and constant. Regulators must audit the algorithm and question Facebook’s choices.
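What might such an audit look like in practice? A minimal sketch, assuming regulators could sample ranked feed sessions with independently assigned content labels; the labels, sampling scheme and top-of-feed cutoff are all illustrative assumptions, not an existing FDA procedure:

```python
from collections import Counter

def audit_top_of_feed(labeled_feeds: list[list[str]],
                      top_k: int = 10) -> dict[str, float]:
    """Estimate the share of top-of-feed impressions by content label.

    labeled_feeds holds one ranked list of content labels per sampled
    session, e.g., ["news_low_trust", "body_image", ...]. Returns each
    label's share of the first top_k feed slots across all sessions.
    """
    counts: Counter[str] = Counter()
    total = 0
    for feed in labeled_feeds:
        counts.update(feed[:top_k])
        total += min(len(feed), top_k)
    return {label: n / total for label, n in counts.items()} if total else {}
```

Run over representative samples of teen accounts, a report like this could show, for instance, whether body-image content is over-represented at the top of the feed, without requiring Facebook to disclose the ranking model itself.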

When it comes to the biggest issue Facebook poses — what the product does to us — regulators have struggled to articulate the problem. Section 230 is correct in its intent and application; the internet cannot function if platforms are liable for every user utterance. And a private company like Facebook loses the trust of its community if it applies arbitrary rules that target users based on their background or political beliefs. Facebook as a company has no explicit duty to uphold the First Amendment, but public perception of its fairness is essential to the brand.

Thus, Zuckerberg has equivocated over the years before belatedly banning Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors. Deciding what speech is privileged or allowed on its platform, Facebook will always be too slow to react, overcautious and ineffective. Zuckerberg cares only for engagement and growth. Our hearts and minds are caught in the balance.

The most frightening part of “An Ugly Truth,” the passage that got everyone in Silicon Valley talking, was the memo that gave the book its name: Andrew “Boz” Bosworth’s 2016 “The Ugly.”

In the memo, Bosworth, Zuckerberg’s longtime deputy, writes:

“So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”

Zuckerberg and Sheryl Sandberg made Bosworth walk back his statements when employees objected, but to outsiders, the memo represents the unvarnished id of Facebook, the ugly truth. Facebook’s monopoly, its stranglehold on our social and political fabric, its growth-at-all-costs mantra of “connection,” is not de facto good. As Bosworth acknowledges, Facebook causes suicides and allows terrorists to organize. This much power concentrated in the hands of one corporation, run by one man, is a threat to our democracy and way of life.

Critics of FDA regulation of social media will claim this is a Big Brother invasion of our personal liberties. But what is the alternative? Why would it be bad for our government to demand that Facebook account to the public for its internal calculations? Is it safe for the number of sessions, time spent and revenue growth to be the only results that matter? What about the collective mental health of the country and the world?

Refusing to study the problem does not mean it does not exist. In the absence of action, we are left with a single man deciding what is right. What is the price we pay for “connection”? This is not up to Zuckerberg. The FDA should decide.

