Are We Headed for Another Expensive Nuclear Arms Race? Could Be.
After the recent death of the Intermediate-Range Nuclear Forces Treaty, a new arms race appears to be taking shape, drawing in more players, more money and more weapons at a time of increased global instability and anxiety about nuclear proliferation.
The arms control architecture of the Cold War, involving tens of thousands of nuclear weapons, was laboriously designed over years of hard-fought negotiations between two superpowers — the United States and the Soviet Union. The elaborate treaties helped keep the world from nuclear annihilation.
Today, those treaties are being abandoned by the United States and Russia just as new strategic competitors not covered by the Cold War accords — like China, North Korea and Iran — are asserting themselves as regional powers and challenging U.S. hegemony.
The dismantling of "arms control," a Cold War mantra, is now heightening the risks of a new era in which nuclear powers India and Pakistan are clashing over Kashmir, nuclear-armed Israel feels threatened by Iran, North Korea is testing new missiles, and other countries like Saudi Arabia are thought to have access to nuclear weapons or to be capable of building them.
The consequence, experts say, is likely to be a more dangerous and unstable environment, even in the near term, that could precipitate unwanted conflicts and demand vast new military spending among the world's biggest powers, including the United States.
"If there's not nuclear disarmament, there will be proliferation," said Joseph Cirincione, a nuclear analyst and president of the Ploughshares Fund, a global security foundation. "If big powers race to build up their arsenals, smaller powers will follow."
"As long as the big boys cling to their toys, others will want them," he added, quoting the former head of the International Atomic Energy Agency, Mohamed ElBaradei.
Not only are the big boys clinging to them, there are more big boys now, and they want more toys.
Washington sees China as a rising strategic rival. The United States is moving to increase its military presence and missile deployments in Asia as a deterrent against a more aggressive Beijing, which has vastly expanded and modernized its stock of medium-range missiles that can hit U.S. ships, as well as Taiwan.
At the same time, President Donald Trump's national security adviser, John R. Bolton, has talked about letting the last strategic-arms control treaty, New START, die in February 2021, without extending it another five years, as foreseen in the accord, which was signed under President Barack Obama.
Early in the Afghanistan War, Army Rangers hunting Taliban fighters along the Pakistan border saw a goatherd with a radio, probably reporting their position to the Taliban. Under the rules of war, soldiers can shoot someone informing enemy forces of their location, but these men saw that the goatherd was just a girl, and they held their fire.
Had she come into the sights of the kind of autonomous robot or drone now under development, rather than those of trained snipers, it might not have distinguished between target and child, and might have killed her, according to Paul Scharre, who was leading the Rangers that day.
Scharre, author of "Army of None: Autonomous Weapons and the Future of War," recounted this episode in a speech this year at Stanford's Center for International Security and Cooperation, laying out the stakes as the artificial intelligence revolution spreads further onto the battlefield.
"How would you design a robot to know the difference between what is legal and what is right?" he asked. "And how would you even begin to write down those rules ahead of time? What if you didn't have a human there, to interpret these, to bring that whole set of human values to those decisions?"
For now, these are hypothetical questions. Two senior Pentagon officials, who spoke to The Times on background because much of their work on artificial intelligence is classified, say the United States is "not even close" to fielding a completely autonomous weapon.
But three years ago, Azerbaijani forces used what appeared to be an Israeli-made kamikaze drone called a Harop to blow up a bus carrying Armenian soldiers. The drone can automatically fly to a site, find a target, dive down and detonate, according to the manufacturer. For now, it is designed to have human controllers who can stop it.
Not long after that, in California, the Pentagon's Strategic Capabilities Office tested 103 unarmed Perdix drones that were able, on their own, to swarm around a target. "They are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature," the office's director at the time, William Roper, said in a Defense Department statement.
As the ability of systems to act autonomously increases, those who study the dangers of such weapons, including the U.N. Group of Governmental Experts, fear that military planners may be tempted to eliminate human controls altogether.