Medical Market Report

New Bill To Prevent AI Single-Handedly Launching Nuclear Weapons Proposed In US

A new bill proposed by US lawmakers would prevent artificial intelligence (AI) from launching nuclear weapons without human input, codifying existing Pentagon rules.

While current rules forbid the autonomous launch of nuclear weapons, no actual law prevents it from happening. With the astronomical rise of AI models in recent years, officials have become concerned that such systems could find their way into the very top-level decision-making of the US military.


In anticipation of this possibility, Senator Edward Markey (D-Mass.) and Representatives Ted W. Lieu (CA-36), Don Beyer (VA-08), and Ken Buck (CO-04) have introduced a bipartisan bill, the Block Nuclear Launch by Autonomous Artificial Intelligence Act, which would “safeguard the nuclear command and control process from any future change in policy that allows artificial intelligence (AI) to make nuclear launch decisions”.

The bill would ensure humans remain “in the loop” for any order by the President to launch a nuclear weapon, whether for defense or offense.

“AI technology is developing at an extremely rapid pace,” said Representative Ted Lieu in a statement.

“While we all try to grapple with the pace at which AI is accelerating, the future of AI and its role in society remains unclear. It is our job as Members of Congress to have responsible foresight when it comes to protecting future generations from potentially devastating consequences. That’s why I’m pleased to introduce the bipartisan, bicameral Block Nuclear Launch by Autonomous AI Act, which will ensure that no matter what happens in the future, a human being has control over the employment of a nuclear weapon – not a robot. AI can never be a substitute for human judgment when it comes to launching nuclear weapons.”


The bill follows through on a recommendation from a 2021 National Security Commission on Artificial Intelligence report, which suggested such a law in the hope that the US would spearhead the idea for other nuclear powers to follow.

AI models have no concept of empathy and would not truly understand the impact of a nuclear weapon, so allowing them uncontrolled access to launch systems could lead to a disaster that might otherwise be averted. For example, Soviet submariner Vasili Arkhipov single-handedly prevented nuclear war when his submarine’s captain mistakenly believed that war had broken out between the US and the Soviet Union – had AI been at the helm, the world could look very different today.
