The war in Ukraine, if it comes (or came overnight; these things move fast), may well be one of the last wars we can process with human perception.
Much as the Boer War in South Africa at the turn of the twentieth century offered a savage preview of the weapons and tactics to come in the Great War, today's conflicts foretell how war will be experienced in the coming decade. Unmanned and in some cases autonomous drones have already been used offensively in North Africa and Syria. Moore's Law, which so far seems to apply as much to drone technology as it did to semiconductors, implies that the capabilities of autonomous hardware will increase exponentially even as costs decline. Fully autonomous war is just around the corner.
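The compounding is easy to underestimate. As a rough sketch (the two-year doubling period and the normalized unit cost below are illustrative assumptions in the spirit of Moore's Law, not measured figures for any drone program):

```python
# Back-of-envelope sketch of Moore's-Law-style compounding for drone hardware.
# The two-year doubling period and the flat unit cost are illustrative
# assumptions, not measured figures.

DOUBLING_PERIOD_YEARS = 2
UNIT_COST = 1.0  # normalized cost of one drone today

for years in (2, 4, 6, 8, 10):
    capability_multiple = 2 ** (years / DOUBLING_PERIOD_YEARS)
    cost_per_capability = UNIT_COST / capability_multiple
    print(f"year {years:>2}: {capability_multiple:5.0f}x capability, "
          f"{cost_per_capability:.3f}x cost per unit of capability")
```

Under those stylized assumptions, a decade yields roughly a 32x gain in capability per dollar. Quibble with the doubling period and the curve shifts, but the shape of the argument does not.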
The Pentagon is certainly aware of the future, but is it positioned to succeed? Drones and autonomous weapons have been studied for decades, and doctrinal debates about them have been just as persistent.
Paul Scharre at the Center for a New American Security explored the intricacies of this coming form of warfare in his excellent 2018 book Army of None. Yet it's one thing to ponder and strategize, and quite another to actually experience and fight such a war firsthand.
Sue Halpern of The New Yorker had a great story this week on how this revolution is going to be experienced. She explores DARPA's autonomous dogfighting challenge, with a specific focus on how human pilots are adapting to this new future of artificial intelligence:
In one scrimmage, his plane and the adversary’s chased each other around and around—on the screen, it looked like water circling a drain. The pilot told [Katharine] Woodruff [a researcher working with SoarTech] that, though he let the A.I. keep fighting, “I know it is not going to gun this guy anytime soon. In real life, if you keep going around like that you’re either going to run out of gas or another bad guy will come up from behind and kill you.” In an actual battle, he would have accepted more risk in order to get to a better offensive angle. “I mean, A.I. should be so much smarter than me,” he said. “So if I’m looking out there, thinking I could have gained some advantage here and A.I. isn’t, I have to start asking why.”
Latent throughout the piece is the same laconic dread that has animated American manufacturing for the past few decades: the sense that machines are taking over, that production is being outsourced to "others," that a way of life is vanishing. The Fordism that powered America's postwar economic golden era has given way to today's wasteland of fentanyl and social bile. Dreams of being a fighter pilot will soon go the way of securing a high-paying union manufacturing job in a prosperous industrial Midwest town.
Yet despite awareness of the future, one senses that nostalgia and inertia are still driving U.S. security policy. Far from punching ahead, the Pentagon's goal with the Air Combat Evolution program is to support a human-computer hybrid model in which humans act as "battle managers" overseeing the work of their AI counterparts. But it's already obvious that the artificial intelligence has little need for the human intelligence sitting in the cockpit:
Trust will also be crucial because, with planes flying at speeds of up to five hundred miles an hour, algorithms won’t always be able to keep pilots in the loop. [Peter Hancock, a psychology professor at the University of Central Florida], calls the discrepancy in reaction time “temporal dissonance.” As an analogy, he pointed to air bags, which deploy within milliseconds, below the threshold of human perception. “As soon as you put me in that loop,” he said, “you’ve defeated the whole purpose of the air bag, which is to inflate almost instantaneously.”
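The arithmetic behind that discrepancy is stark. As a rough sketch using the article's 500 mph figure (the quarter-second human reaction time below is a textbook ballpark, not a number from the piece):

```python
# Rough arithmetic behind "temporal dissonance": how far a jet travels
# before a human can even begin to react. The 250 ms reaction time is a
# textbook ballpark, not a figure from the article.

SPEED_MPH = 500
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600
HUMAN_REACTION_S = 0.25  # assumed human reaction time

feet_per_second = SPEED_MPH * FEET_PER_MILE / SECONDS_PER_HOUR  # ~733 ft/s
distance_before_reaction = feet_per_second * HUMAN_REACTION_S   # ~183 ft

print(f"{feet_per_second:.0f} ft/s -> {distance_before_reaction:.0f} ft "
      "traveled before the pilot even begins to respond")
```

Nearly two hundred feet of flight elapse before a human pilot starts to act, which is exactly the window where an algorithm, like an air bag, has to operate alone.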
The dogfighting trials are early and ongoing, but AI pilots are arriving quickly. In fact, we will almost certainly deploy AI fighter planes before autonomous vehicles reach our roads. Our cars have to halt at fallen stop signs, weave around construction sites, and navigate the intricacies of pedestrians, delivery vans, bikes, and more, all without making a single, tragic mistake. An AI fighter plane? It may be hesitant to target and neutralize an enemy plane, but the constraints of the road blow away in the air.
There is a discontinuity coming in war, where the costs won't be borne by the humans fighting, but exclusively by the humans who are being fought over. If, as some political scientists believe, the elimination of an active draft in the United States encouraged military adventurism by lowering the costs of going to war, the same will hold even more true when there are hardly any costs at all. Even worse, that calculus will apply equally to America's adversaries, from China and Russia to insurgents.
Swarms of cheap, violent, autonomous weapons that can't be stopped with conventional hardware. It's the new Fordism of defense, but unfortunately, the United States is still obsessed with the old Fordism.
The Pentagon's testing report for the new Gerald R. Ford class of aircraft carrier, which was obtained by Bloomberg this week and is expected to be submitted to Congress later this year, notes widespread shortcomings in the ship's ability to defend itself as well as in its operational systems. The ship, which cost about $13 billion and on which construction began in 2005, could see operational deployment later this year.
Compare that prodigious spend on a single vessel to the entire autonomous warfare budget. According to Halpern in The New Yorker, the Pentagon will spend $1 billion on AI this year.
That's the fundamental tension of defense acquisition today. The multi-decade commitments that advanced warfighting platforms like the Gerald R. Ford require are slamming straight into the Moore's Law of autonomous machines. The Ford's construction was underway before the introduction of the iPhone, and it's still not in service, while experimental Chinese weapons like hypersonic missiles will likely prove effective at neutering it (an exercise that Beijing has already made a clear priority).
There's the canonical line attributed to William Gibson about the future not being evenly distributed, but what happens when the future is staring you right in the face and nostalgic blindness prevents a leap to the next generation? It's a Brave New World out there, and the old Fordism of the Navy and the rest of defense is running up against the Fordism of the future. Cheap and abundant will beat expensive and rare.
There are some positive moves to adapt to these new threats. Lux portfolio company Anduril announced a nearly $1 billion contract with the U.S. Special Operations Command (SOCOM) on an indefinite-delivery, indefinite-quantity basis (government-speak for an open-ended contract over a fixed period). Anduril will work as a systems integration partner with SOCOM on countering unmanned threats.
Down payments on the future are good. But we need more, faster, just like the AI weapons systems that will buzz past the barrier of human perception.