The shock drones, they worried, would stun innocent students or be abused by hackers, vandals or the police. Even if deployed appropriately, they might not be enough to take out a gunman. And the problem at Uvalde, some noted, wasn’t a lack of firepower: Nineteen officers had waited outside the classroom door for 47 minutes, wrongly believing the children inside were no longer in danger.
“It’s such an obviously bad idea to use these in the context of schools. I mean, it’s absurd,” said Ryan Calo, one of nine members of Axon’s artificial-intelligence ethics advisory board who resigned to protest the company’s pursuit of the idea. “You cannot address these horrific national tragedies … by throwing a Taser on a drone.”
Critics said the idea cast a spotlight on the security theater that routinely colors the nation’s response to mass shootings, promising an unfounded sense of safety, rather than actual safety, for a tragedy that happens in the United States far more often than anywhere else on earth.
Instead of focusing on guns, they argue, companies have pushed lawmakers to focus everywhere else, selling bulletproof backpacks, school surveillance software, facial recognition scanners and other systems that critics call reactive, problematic and ineffective at stopping future massacres.
Axon, which manufactures a variety of Tasers under the general rubric “energy weapons,” declined to make any executives available for an interview. Rick Smith, its founder and chief, said in a statement Sunday that the response to the project had “provided us with a deeper appreciation of the complex and important considerations” relating to shock drones in schools and added, “I acknowledge that our passion for finding new solutions to stop mass shootings led us to move quickly to share our ideas.”
Though he had previously suggested the system might be operational in two years, he said in the statement that the idea remains “a long way off” and that the company still needs to research whether such drones are “even viable.”
Smith added that it was “unfortunate” that board members had resigned before the company “had a chance to address their technical questions” and that it would continue to “seek diverse perspectives” to advise it on other technological ideas.
But in a statement Monday, the resigned board members said the drone had “no realistic chance of solving the mass shooting problem Axon now is prescribing it for, only distracting society from real solutions.”
“Before Axon’s announcement, we pleaded with the company to pull back,” the members said. “But the company charged ahead in a way that struck many of us as trading on the tragedy of the Uvalde and Buffalo shootings. … [It] is more than any of us can abide.”
Axon has become one of the biggest police-contracting companies in the United States, thanks to its sale of body-worn cameras and Taser weapons, which fire electroshock barbs that can stun a person into submission.
Axon advertises Tasers as “less-lethal,” though a USA Today investigation last year said more than 500 people have died shortly after being shocked. Police officers reaching for the pistol-shaped weapons have also mistakenly drawn their handguns, including in the fatal shooting last year of Daunte Wright.
The company convened its AI ethics board in 2018 as it considered and ultimately declined to enable facial recognition on its body cameras, which critics worried could lead to dangerous misidentifications or the automated surveillance of protests or other public events. “We don’t want to create an Orwellian state just to make a buck,” Smith said then in an interview with The Washington Post.
The board’s advice is not binding, and the company is free to ignore it. But its independent mix of paid technical and legal experts thought they had held some productive discussions over the years with Axon as the company pursued license-plate scanners and other surveillance tools, said Calo, a University of Washington professor who researches technology and the law.
Roughly a year ago, Axon had asked the board whether shock drones could be ethically deployed in a scenario where officers needed remote-strike capabilities and feared for their lives. After deliberating, the board said in a statement that the company would need to implement safeguards to make the idea even “remotely plausible.”
The board voted last month that the company should not move forward with the idea, saying the weaponized drones could increase how often police resort to force “in over-policed communities and communities of color.” Members were preparing a full report, scheduled for release this fall, on whether the project should be marketed to police at all.
Members were surprised, then, when Smith on Thursday announced that the company was “officially beginning development” of a shock drone that could be used in a much more expansive role, to “stop” school shootings, with promises of “threat incapacitation in less than 60 seconds.”
In a video announcement featuring slow-motion footage of a drone firing a dart, Smith said the company had already built test systems and had started on the design phase for a system he projected would take about two years to create. In concept renderings released by the company, the quadcopter drone is shown as having four cameras, a dart-firing barrel, a loudspeaker and a “precision aim-point laser.”
“I am done waiting for politicians to solve the problem. So we’re going to solve it,” Smith said. “We’re going to do this.”
Smith has promoted the idea for years, even including it in a graphic novel, “The End of Killing,” that shows a drone zapping a gunman rampaging through a day-care center. And in a question-and-answer session on Reddit the day after the announcement, Smith said he knows the idea might “sound nuts” but that it offers some benefits over “today’s solution” for responding to shootings: “a local person with a gun.”
The shock drones, he said, would be installed in ceiling-mounted “launch stations,” like smoke detectors, and be shielded to prevent “kids throwing stuff at” them. Schools, he said, could install “simple, low-cost vents” above doors to allow the drones to fly into locked rooms, though he also acknowledged that idea could raise “some fire code issues” due to smoke ventilation.
The drones could fire a payload of up to four shock probes more than 40 feet, he said, and deliver a sustained current of electricity to incapacitate an attacker long enough for people nearby to kill them or take their gun. The drones would be small and hard to shoot at, he wrote, and “after running out of darts, we could ram the drone into someone to physically distract” them.
Schools or police agencies, he said, would pay an estimated $1,000 a year per drone, and the company would only sell them in markets where “they would not be used abusively.”
The Federal Aviation Administration in 2018 banned anyone from flying a drone with a dangerous weapon attached. But Smith said such “legal limitations” could be resolved over time; Taser weapons and body cameras, he noted, had also been illegal in some states before Axon had started marketing them.
The company has “a long history of working in situations where the laws didn’t support our tech — and then did when people understood what we were trying to do,” he wrote.
In the Reddit session, Smith was asked how he would handle pushback from parents not wanting flying shock machines near their children. “Many parents would likely find this situation more comfortable than an armed guard stationed at the school,” he said.
But the response on Reddit was scalding. Some commenters worried that the drones would be misused to punish students, break up fights or police protests, or that they’d lead to unintended consequences, like more people getting shot after the gunman was shocked.
Others questioned whether Axon was capitalizing on the emotion of the moment to attract investors or sell a product, and said the proposal was a sad commentary on America’s feeble response to a national crisis.
“The fact that we’re brainstorming about drones in schools, whether the motivation is capitalism, parental-instinct, or both, means our society is already pretty ill,” one commenter wrote. Wrote another: “We sure do love addressing symptoms instead of root causes don’t we.”