Artificial intelligence has significantly advanced our military in recent years.
From surveillance drones to fitness trackers to maintenance scheduling, AI has been a major part of the military's modernization.
Now, however, reports are surfacing that the Pentagon is pushing AI-enabled weapons, and it is creating growing concern.
The Next Step
A recent CBS News report stated that the Pentagon is pushing forward with AI-enabled autonomous vehicles within the next two years in order to keep pace with China.
The initiative is reportedly called the “Replicator.”
Deputy Secretary of Defense Kathleen Hicks stated that the program seeks to "galvanize progress in the too-slow shift of U.S. military innovation to leverage platforms that are small, smart, cheap, and many.”
As this program accelerates, there is little doubt this will lead to autonomous lethal weapons, and that is raising some serious concerns.
Critics of the program are calling these weapons “killer robots,” or “slaughterbots.”
To date, these weapons have not been used in combat, but it appears to be only a matter of time before that happens.
No treaties govern these weapons, as they do nuclear devices, and no ethics framework is in place to guide how they are used.
Anna Hehir, who leads autonomous weapons system research for the advocacy organization Future of Life Institute (FLI), stated, “It’s really a Pandora’s box that we’re starting to see open, and it will be very hard to go back.
“I would argue for the Pentagon to view the use of AI in military use as on par with the start of the nuclear era.
“So this is a novel technology that we don’t understand. And if we view this in an arms race type of way, which is what the Pentagon is doing, then we can head to global catastrophe.”
Michael Klare, secretary of the Arms Control Association’s board of directors and a senior visiting fellow researching emerging technologies, expressed concern that these automated weapons could spark an unintended conflict.
He stated, “The multiplication of these kinds of autonomous devices will increase the potential for unintended or accidental escalation of conflict across the nuclear threshold [that could] trigger a nuclear war.
“We fear that there’s a lot of potential for that down the road. And that’s not being given careful consideration by the people who are fielding these weapons.”
This just reminds me of scenes from “Edge of Tomorrow” or something from the “Matrix” franchise where war machines start to take over the world.