Mind Matters Reporting on Natural and Artificial Intelligence
Dozens of Drones Swarm in the Cloudy Sky.

Meet the U.S. Army’s New Drone Swarms

As with insects, only a few drones need survive to accomplish their task

The US Army is developing a “swarm” of autonomous AI drones to protect combat helicopters. The swarm is modeled after social swarming insects such as bees and ants that protect their queen. The drone nest protects the queen helicopter at all costs.

The protective swarm’s tasks will range from sophisticated electronic warfare to acting as false targets (decoys) for incoming missiles. The drones will carry out these tasks autonomously:

Goals and tasks must be assigned by a person, but how they are carried out (reaching the target, navigation, and flight control) is to be “in the hands of” advanced software and artificial intelligence.

TOC, “The US Army is developing a ‘pocket’ swarm of combat drones”

Here’s what a small swarm can do:

The most chilling achievable AI weapon of the near future is the autonomously operating offensive drone swarm. Such swarms are depicted in the slick Black Mirror-flavored video Slaughterbots. Each drone is equipped with facial recognition software and an embedded explosive charge. When a face is recognized, the explosive charge is fired into the target’s brain.

Slaughterbot-like drones were also featured in the 2019 action-adventure movie Angel Has Fallen, where an attack drone swarm kills all of the members of the US President’s entourage except for the President (played by Morgan Freeman) and one of his protectors.

The left-wing producers of the original Slaughterbots video depict military contractors as stereotypical heartless beasts, interested only in killing and a fast buck, and call for a ban on all autonomous weapons. That will be about as effective as Neville Chamberlain’s agreement with Adolf Hitler on “peace for our time.”

One problem is that drones are cheap and easy to arm and deploy. In 2015, a teenage tinkerer jury-rigged a drone with a handgun. The gun was fired from the flying drone using remote control. If a lone teenager can do this, think what almost any nation with a modicum of technical expertise and publicly available AI software can do.

Offensive drone swarms are chilling because of their survivability. Kick over an ant hill and stomp as much as you want. When you come back in a few weeks, the ant colony has survived and rebuilt its hill. Likewise, an attacking swarm of drones can lose ninety percent of its members and the survivors can still accomplish their mission.
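This survivability argument is just binomial arithmetic. As a hedged illustration (the swarm size and attrition rate below are hypothetical, not figures from the Army program), the sketch computes the chance that at least one drone in a swarm outlives a given per-drone kill probability:

```python
from math import comb

def prob_at_least_k_survive(n, p_destroyed, k):
    """Probability that at least k of n drones survive,
    assuming each drone is destroyed independently with
    probability p_destroyed (a simplifying assumption)."""
    p_survive = 1 - p_destroyed
    return sum(
        comb(n, s) * p_survive**s * p_destroyed**(n - s)
        for s in range(k, n + 1)
    )

# Hypothetical example: a swarm of 100 drones, each facing a
# 90% chance of being destroyed. The odds that at least one
# survives to finish the mission are still better than 99.99%.
p = prob_at_least_k_survive(100, 0.9, 1)
print(round(p, 6))
```

Under these toy assumptions the swarm is nearly certain to retain a survivor, which is why attrition alone is such a weak defense against redundant attackers.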

What technology can counter an attack swarm of drones? Security analyst Paul Scharre notes that, in some cases, something as simple as chicken wire can bar small drone swarms from an area. Meanwhile, Israel has developed a laser weapon capable of destroying a single flying drone. Such a weapon might be generalized to engage a small swarm of drones. But operation on foggy days still seems questionable.

Swarm-on-swarm dogfights are another possible defense against drones, but they would require autonomous reaction times far shorter than any human can achieve. Total autonomy would become a necessity.

What is the most dangerous aspect of the Army’s development of a defensive autonomous drone swarm? Unintended outcomes. To ensure that they don’t happen (as well as can reasonably be expected), the Army requires expert programmers and extensive testing under different scenarios and in numerous environments.

Swarming drones might be chilling, but the US military must consider them. China has invested $30 billion in AI research. Russia’s leader, Vladimir Putin, has said, “Whoever becomes the leader in … [AI] will become the ruler of the world.” To remain militarily viable, the US must continue to develop AI-embedded weapons.

Note: Robert J. Marks is the author of The Case for Killer Robots.

Further reading:

Slaughterbots: Is it ethical to develop a swarm of killer AI drones? (Robert J. Marks)


Slaughterbots: How far is too far? And how will we know if we have crossed a line? (Eric Holloway)


Robert J. Marks II

Director, Senior Fellow, Walter Bradley Center for Natural & Artificial Intelligence
Besides serving as Director, Robert J. Marks Ph.D. hosts the Mind Matters podcast for the Bradley Center. He is Distinguished Professor of Electrical and Computer Engineering at Baylor University. Marks is a Fellow of both the Institute of Electrical and Electronic Engineers (IEEE) and the Optical Society of America. He was Charter President of the IEEE Neural Networks Council and served as Editor-in-Chief of the IEEE Transactions on Neural Networks. He is coauthor of the books Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks (MIT Press) and Introduction to Evolutionary Informatics (World Scientific). For more information, see Dr. Marks’s expanded bio.

