AI Researcher Calls For All AI Experiments To Be Shut Down Immediately

March 30, 2023 by Deborah Bloomfield

An open letter signed by artificial intelligence (AI) researchers, directors of institutes, and CEOs of social media sites, including Elon Musk, has asked for all AI experiments to be paused immediately, given the “profound risks to society and humanity” if an advanced AI were created without proper management and planning.

Meanwhile, another researcher writing in Time argues that this isn’t going far enough and that we need to “shut it all down” and prohibit certain tech if humanity is to survive long term.


The open letter says that an “AI summer”, in which all labs pause any work on anything more powerful than OpenAI’s GPT-4, is needed, as in recent months AI researchers have been in an “out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.”

During the pause, the open letter asks AI labs and experts to come together to develop shared safety protocols for designing AI, to be overseen by independent outside experts. It also suggests AI researchers should work with policy-makers to create systems of oversight for AI, as well as smaller practical steps such as watermarking AI-generated images.

“Humanity can enjoy a flourishing future with AI. Having succeeded in creating powerful AI systems, we can now enjoy an ‘AI summer’ in which we reap the rewards, engineer these systems for the clear benefit of all, and give society a chance to adapt,” the letter concludes. 

“Society has hit pause on other technologies with potentially catastrophic effects on society. We can do so here. Let’s enjoy a long AI summer, not rush unprepared into a fall.”


The letter was signed by researchers from Google and DeepMind, as well as Apple co-founder Steve Wozniak.

Some are calling it a PR exercise, one that talks up the power of the tech and its potential future dangers while not addressing the real-world problems created by current (and near-future) AIs in the short term.

However, for Eliezer Yudkowsky, American computer scientist and lead researcher at the Machine Intelligence Research Institute, the letter doesn’t go far enough.

“Many researchers steeped in these issues, including myself, expect that the most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die,” he writes in a piece for Time, likening humanity competing with AI to everyone from the 11th Century attempting to fight everyone from the 21st Century.


Yudkowsky believes we are currently far behind where we need to be to safely create an AI that won’t eventually lead to humanity’s demise, and that catching up could take 30 years. He proposes limiting the computing power available to anyone training AI, then gradually lowering that cap as algorithms become more efficient, to compensate. Essentially, though, his policy is to “shut it all down”.

“Progress in AI capabilities is running vastly, vastly ahead of progress in AI alignment or even progress in understanding what the hell is going on inside those systems,” he writes. “If we actually do this, we are all going to die.”




