We saw a demo of the new AI system powering Anduril’s vision for war

I was here to examine the pitch being made by Anduril, other defense-tech companies, and growing numbers within the Pentagon itself: a future “great power” conflict (military jargon for a war between major world powers, fought on a global scale) will not be won by the entity with the most advanced drones or the most firepower, or even the cheapest firepower. It will be won by whichever entity can sort through and share information the fastest. And that will have to be done “at the edge,” where threats arise, not necessarily at a command post in Washington.

“You’re going to need to really empower lower levels to make decisions, to understand what’s going on, and to fight,” says Anduril CEO Brian Schimpf. “That is a different paradigm than today,” where information flows poorly among people on the battlefield and the decision makers higher up the chain.

To show how it intends to fix that, Anduril walked me through an exercise showing how its system would take down an incoming drone threatening a base belonging to the US military or its allies (the scenario at the center of Anduril’s new partnership with OpenAI). It began with a truck in the distance, driving toward the base. The AI-powered Sentry tower automatically flagged the object as a possible threat, highlighting it as a dot on one of the screens. Anduril’s software, called Lattice, sent a notification asking the human operator whether he would like to send a Ghost drone to monitor it. After a click of his mouse, the drone piloted itself autonomously toward the truck, with the location data gathered by the Sentry relayed to the drone by the software.

The truck disappeared behind some hills, and the Sentry tower camera that had initially been trained on it lost contact. But the surveillance drone had already identified it, so its location stayed visible on the screen. We watched as someone got out of the truck and launched a drone, which Lattice again labeled as a threat. It asked the operator whether he’d like to send a second drone, this time an attack drone, which piloted itself autonomously and locked onto the threatening drone. With one click, it could have been instructed to fly into the target fast enough to take it down. (We stopped short here, since Anduril isn’t allowed to actually take down drones at this test site.) The entire operation could have been managed by one person with a mouse and a computer.
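To make that human-in-the-loop flow concrete, here is a minimal sketch of the pattern the demo followed: a sensor flags a track, the software proposes an action, and nothing happens until the operator clicks to approve. All class and function names below are hypothetical stand-ins invented for illustration; Anduril’s actual Lattice interfaces are not public.

```python
# Illustrative sketch only; not Anduril's API.
from dataclasses import dataclass


@dataclass
class Track:
    track_id: str
    kind: str        # e.g. "truck" or "drone"
    position: tuple  # (latitude, longitude)
    threat: bool


def dispatch_surveillance_drone(track: Track) -> None:
    print(f"Surveillance drone en route to monitor {track.track_id} at {track.position}")


def dispatch_attack_drone(track: Track) -> None:
    print(f"Attack drone locked onto {track.track_id}; awaiting final click to engage")


def on_new_track(track: Track, operator_approves) -> None:
    """Mirror the demo's flow: a sensor flags a threat, a human decides, a drone acts."""
    if not track.threat:
        return
    # The software only *proposes* an action; a person has to click to approve it.
    if track.kind == "truck" and operator_approves(f"Send surveillance drone after {track.track_id}?"):
        dispatch_surveillance_drone(track)  # the drone then pilots itself to the target
    elif track.kind == "drone" and operator_approves(f"Send attack drone after {track.track_id}?"):
        dispatch_attack_drone(track)


# Stand-in for the operator's mouse click: approve every proposal.
on_new_track(Track("track-042", "truck", (35.1, -117.6), threat=True), lambda prompt: True)
```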

Anduril is building on these capabilities by expanding Lattice Mesh, a software suite that allows other companies to tap into Anduril’s software and share data, the company announced today. More than 10 companies are now building their hardware, everything from autonomous submarines to self-driving trucks, into the system, and Anduril has released a software development kit to help them do so. Military personnel operating that hardware can then “publish” their own data to the network and “subscribe” to receive data feeds from other sensors in a secure environment. On December 3, the Pentagon’s Chief Digital and AI Office awarded Anduril a three-year contract for Mesh.
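The “publish” and “subscribe” language describes a standard messaging pattern, and a toy version of it helps show what Mesh is promising. The sketch below is generic publish/subscribe code, not Anduril’s SDK, whose interfaces are not public; the topic name and message fields are invented for illustration.

```python
# Generic pub/sub sketch; not Anduril's SDK.
from collections import defaultdict
from typing import Callable, Dict, List


class MeshBus:
    """A toy in-process message bus: hardware 'publishes' tracks, other systems 'subscribe'."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        # Register interest in a feed; the handler runs on every message published to it.
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Fan the message out to every subscriber, with no point-to-point links to maintain.
        for handler in self._subscribers[topic]:
            handler(message)


bus = MeshBus()

# A base's common operating picture subscribes to an autonomous submarine's sensor feed.
bus.subscribe("tracks/undersea", lambda msg: print("Base display received:", msg))

# The submarine publishes a detection; every subscriber sees it at once.
bus.publish("tracks/undersea", {"id": "contact-07", "lat": 36.0, "lon": -122.4})
```

In the real system, the hard parts are the ones a toy bus ignores: keeping the feeds secure and getting them to users at the tactical edge rather than only at headquarters.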

Anduril’s offering will also join forces with Maven, a program operated by the defense-data giant Palantir that fuses information from different sources, such as satellite imagery and geolocation data. Maven is the project that prompted Google employees to protest in 2018 against the company’s involvement in warfare work. Anduril and Palantir announced on December 6 that the military will be able to use the Maven and Lattice systems together.

The aim is to make Anduril’s software indispensable to decision makers. It also represents a massive expansion of the military’s current use of AI. You might think the Department of Defense, advanced as it is, already has this level of connectivity between its hardware. We have some semblance of it in our daily lives, where phones, smart TVs, laptops, and other devices can talk to one another and share information. But for the most part, the Pentagon is behind.

“There’s so much information in this battle space, particularly with the growth of drones, cameras, and other types of remote sensors, where folks are just sopping up tons of information,” says Zak Kallenborn, a warfare analyst who works with the Center for Strategic and International Studies. Sorting through it all to find the most important information is a challenge. “There might be something in there, but there’s so much of it that we can’t just set a human down to deal with it,” he says.
