A concept image of a Taser drone (IMAGE: Axon)

Exclusive: Axon still wants to put Taser drones in your kid’s school

This week, Axon, the company that developed the Taser, is hosting a conference in Las Vegas called TaserCon. The event is billed as an opportunity to talk about law enforcement and public safety, and Axon was expected to use the occasion to reintroduce a controversial plan: putting stun-gun-equipped drones in police departments and schools to prevent mass shootings.

In fact, on stage during his keynote address, Axon’s founder and CEO, Rick Smith, announced partnerships with a roster of drone companies and demonstrated some new drone technology.

Last summer, Smith publicly floated his weaponized drone plan, saying it could help prevent shootings like the ones in Buffalo, New York, and Uvalde, Texas. “We need different, better solutions, including ones that leverage technology to protect our schools, teachers and students,” Smith said at the time.

Weaponized drones were something Axon had been focused on for some time. More than a year before the Uvalde shooting, Axon had asked its ethics board to conduct an evaluation of “Project Ion,” a narrower program that envisioned Taser-equipped drones as an alternative to firearms when proximity to a shooter would endanger police. 

The company’s AI Ethics Board considered the plan for more than a year and even developed a possible pilot before recommending that Axon abandon it. The board concluded that Axon’s proposal for Taser-equipped drones could be easily abused. Weaponized drones in schools were not part of its evaluation, and in response to Smith’s announcement, nine of the 12 board members resigned, saying his latest proposal took them by surprise.

Click Here has obtained a report that provides new details about the board’s deliberations around Project Ion. Written by former members of the Axon ethics board and the Policing Project at New York University School of Law, it lays out precisely why the board advised Axon to shelve the proposal. 

Among their concerns was that the drones could actually increase the overall use of force, and that deploying Tasers from such a distance risked dehumanizing the people they would target.

In a school setting, these concerns were only amplified.

“Axon’s focus on deploying this technology in schools was particularly concerning,” the introduction to the report warns. “Not only would Axon’s plans entail the installation of persistent surveillance (a form of surveillance the Board had long cautioned Axon against), several members of the Board believed weaponized drones stood little chance of solving the problem of mass shootings.” 

Rules for technology

Ethics boards grew out of a general understanding that setting rules for technology requires a conversation among the people who make the technology, the people who will be affected by it, and the people who will use it. It was from that belief that Axon’s AI ethics board was established.

Artificial intelligence software allows drones to perceive their surroundings, map areas, track objects and provide analytical feedback in real time – all of which can be very helpful to law enforcement.

The problem is that AI comes with a roster of ethical risks, from biased algorithms that tend to disproportionately single out black or brown people to potential infringements on privacy because of a drone's all-seeing eye. What’s more, AI devices tend to be built to operate at scale so that any problems or biases that might be in the software affect scores of people, all at once.

This is one reason why Axon, a leading manufacturer of policing technology, decided to create an independent review board that would advise the company on how to develop AI-powered products without trampling on civil liberties.

“Axon was operating in a space that was fraught and they thought it would be good to have an independent outside body that would guide them,” said Barry Friedman, a founding member of the board, a law professor at New York University and the director of the school’s Policing Project. “We were a little skeptical of it, frankly, but we agreed to listen – and I think over the period of time in which the ethics board was operating and operating well, everybody felt that good work was being done.”

Good work like convincing Axon that adding facial recognition software to its police body cams could end up misidentifying suspects — particularly women and people of color. The board also raised concerns about Axon’s plans to deploy high-speed license plate scanners, which could be misused to track people. The company eventually agreed to modify the way it deployed them.

So when Axon’s chairman asked the board to take a look at the company’s proposal for Taser-equipped drones, known as Project Ion, the board assumed it would end up as another instance in which it could keep a technology from getting ahead of itself.

“Weaponizing drones and robots has been a frontier, right?” Friedman said. “And so the question was, is that a line that we just don't wanna cross?”

According to the report, the 12-member board quickly agreed that certain use cases were just non-starters, such as crowd control at protests or patrolling the border. “The drone has value as an eye in the sky,” Friedman said. “That in itself is incredibly invasive, but to then think that’s going to be able to zap people that it sees, that’s disturbing science fiction. On some level it conjures up images from Star Wars or something. It's not so good.”

Dallas, 2016

Back in 2016, the Dallas police used an unprecedented tactic to end a standoff with a gunman suspected of killing five police officers: they blew him up with a robot.

After hours of negotiations and an exchange of gunfire, Dallas Police Chief David O. Brown decided to attach an explosive device to one of the department’s bomb-defusing robots. “Other options would have exposed the officers to grave danger,” he said at the time.

But critics saw the decision as dangerously blurring the line between policing and warfare. To some, it made clear that the decision about who becomes the target of a new technology can be purely subjective: Chief Brown decided to send in the robot. Another police chief somewhere else might not have.

Axon’s AI Ethics Board, according to the report, didn’t dismiss the idea of weaponized drones out of hand. Instead, its members seemed to embrace the idea that there might be some very limited cases in which police could find drones helpful – specifically situations like the Dallas standoff.

“We unanimously understood why you might want to be able to use less lethal force from a distance to save lives,” Friedman said, recalling the deliberations. “And the police on the ethics board told stories about horror shows they'd experienced in their departments where somebody had been shot and their lives might have been saved with something like this.”

But the board was also concerned about the social context in which stun-gun drones would be placed. “You can't just develop the technology with some safeguards,” said Max Isaacs, staff attorney at NYU’s Policing Project and one of the people who helped the board with its assessment. “You need to look at the types of policies that agencies are enacting, the way that they're enforced, the way that officers are trained, all of the pieces kind of combined.” 

That’s why, he said, the discussion wasn’t just about weaponizing the drones, but about Tasers themselves. And while these less-lethal weapons are used by some 94 percent of the nation’s police departments, they are not without controversy.

Isaacs said the Policing Project had gathered evidence of Tasers being used against school children, the elderly and even people who were restrained in handcuffs. “And given that context, given the fact that the Taser exists today, and we haven't found a way to prevent these abuses from happening, those concerns are only magnified if you put a Taser on a drone.”

Back in 2017, Cincinnati police fired a Taser at Dawson Lee, a 14-year-old black teenager who fled when officers stopped him in connection with a stolen vehicle case. (Lee wasn’t involved.) Body camera footage captured the subsequent chase and Lee falling to the pavement, breaking his collarbone. The officer used his Taser on Lee twice, and the Lee family ended up filing an excessive force lawsuit against the Cincinnati police. They eventually settled for an undisclosed sum, and the police did not have to admit to any wrongdoing. Cincinnati police did not immediately respond to Click Here’s request for comment.

“I understand you gotta protect and serve and go home to your family,” said Diondree Lee, Dawson’s father. “But every 14-year-old African-American that you run into is not used to you doing them the way you did my child. That was his first encounter and only encounter.”

The incident also points to an issue Friedman said came up often in deliberations: “The idea of a control panel with a Taser drone felt a little bit too much like a gaming platform and with 18,000 law enforcement agencies in the country, as you might guess, there's a wide variety of quality.”

In the end, that was one of the concerns they just couldn’t overcome. “There was a group of us that were just concerned that as well as we could design this, as much as we believed it was something the world could benefit from, we couldn't trust the overall variance in policing to make this a commercially viable product.”

So after a year of study and deliberation, Axon’s AI ethics board took a vote. Barry Friedman chaired the meeting. “It was easily our most fraught ever,” he said. “And I think it was fraught in part because all of us understood the compelling use case. We all kind of went around the room and said how we felt. And then we went back and forth. And then after that we took a vote.” 

It was 8-4 against developing a Taser-enabled drone program. “We decided not to go ahead with the pilot and it was a tough decision,” he said, adding that Smith, the CEO, was disappointed. “This has really been one of his dreams, I think.”

But according to Friedman, the board never had a chance to explain to Smith why they came to the conclusion they did.

Buffalo and Uvalde

A few weeks later, in two separate incidents, there were mass shootings at a supermarket in Buffalo, New York, and at a school in Uvalde, Texas. That is when Smith announced the company would be developing a Taser drone for schools and laid out in a video how it would work.

“A human operator gives the go signal, then the drone rotors up, it immediately deploys into the scene, and together with the operator it can help identify the threat and there it could incapacitate that threat,” he said in an Axon video released at the time. “I believe this is how we can end school shootings.”

Mecole Jordan-McBride is a community organizer in Chicago and was a member of the Axon ethics board. She was one of the four members of the board who voted to allow the Taser drone pilot to move forward. She said she was stunned when Smith decided to announce a plan to deploy weaponized drones in schools. 

We asked Axon for an interview with Smith for this story, which the company declined. 

“It just felt fanciful to me and it felt like an emotional response that wasn't completely thought through,” Jordan-McBride said, adding that she didn’t know how it would even work. “I was thinking like how many school buildings are in Chicago, it's impossible for you to know which school would potentially be a victim to this, right? And so now are we talking about literally putting a drone in every single school across America? I thought about the amount of money that would be. I thought about the over surveillance of that.” 

To figure out where a shooter was, Axon would either need access to cameras already inside the school or would need someone to install them. In his video laying out the plan, Smith said Axon was “working through partners [to] activate any camera in any school, church, or public building so that it can be easily shared with first responders,” and that pre-placed drones would then “stop threats in less than a minute.”

But consider how school shootings typically unfold: a gunman enters a school armed to the teeth, often wearing body armor. (Tasers can’t penetrate body armor and sometimes have trouble even penetrating clothing.) Shooters often barricade themselves in a classroom, as happened in Uvalde. In that case, police thought it was too dangerous to go into the classroom where the shooter was, so they waited down the hall for 70 minutes.

“Drones have to get around and shooters go into rooms and close doors,” Friedman said. “And the company's answer was, well, we'll just cut holes in all the doors so the drone could get through. I mean you're trying a little too hard.”

For Axon’s ethics board, which only weeks earlier had advised against a smaller, more controlled Taser pilot program, Smith’s school drone program demanded a reaction. “We could have talked as a board and we could have talked with the company, but there was this great eagerness on Rick's part to get this idea out there in the aftermath of Uvalde and we just couldn’t operate that way.”

Nine members of the board resigned and it was eventually disbanded. Friedman said he felt like there wasn’t a choice. “Trust was so much of the work that we did,” he said. “Board members were often attacked from the outside world for, you know, working with Axon. And it's interesting because in the aftermath of the collapse of the board, I got a number of emails from folks in different places in the civil liberties and racial justice community saying it's a good thing that you did the work and it's a good thing that you stopped when you did, given the circumstances.”

After the board resigned, Smith said he’d put the whole Taser drone project – both the pilot and his effort to put drones in schools – on hold. But this week, Axon is set to host a session at TaserCon called “Weapons of Mass Construction,” and drones in schools are expected to be part of that discussion.

Smith’s keynote address at TaserCon on Tuesday also included a presentation on drones, and the CEO said Axon has already partnered with multiple drone companies.

The company told Click Here in a statement that Taser drones in schools are an idea, not a product, and a long way off. In the meantime, Smith has said publicly that he’s engaging with teachers and school boards and continuing to explore the idea.

“I know that they have a new ethics board or something like the ethics board,” Jordan-McBride, the former board member, said. “My hope and my prayer is that those individuals are asking the hard questions, and that their design team is really pushing back and trying to answer for all of these what ifs.” 

The original ethics board report, which addresses those ‘what ifs,’ is expected to be publicly released this week.

Additional reporting provided by Sean Powers and Will Jarvis.

Dina Temple-Raston is the Host and Managing Editor of the Click Here podcast as well as a senior correspondent at Recorded Future News. She previously served on NPR’s Investigations team, focusing on breaking news, national security, technology and social justice, and she created and hosted the award-winning Audible podcast “What Were You Thinking.”