American cops are using AI to draft police reports, and the ACLU isn't happy

Do we really need to explain why this is a problem?

Comment AI use by law enforcement to identify suspects is already problematic enough, but civil liberties groups have a fresh worry: the technology is also being employed to draft police reports.

The American Civil Liberties Union published a report this week detailing its concerns with law enforcement tech provider Axon's Draft One, a ChatGPT-based system that translates body camera recordings into drafts of police reports that officers need only edit and flesh out to ostensibly save them time spent on desk work. 
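Axon hasn't published Draft One's internals, but the workflow it describes - transcribe bodycam audio, feed the transcript to an LLM, hand the draft to an officer for editing - can be sketched in outline. Everything below is hypothetical: the transcription and model calls are stubs, not Axon's code (the company says only that Draft One is built on Microsoft's Azure OpenAI Service).

```python
# Hypothetical sketch of a transcript-to-draft pipeline along the lines
# Axon describes for Draft One. Transcription and the LLM call are
# stubbed out; a real system would use a speech-to-text service and a
# hosted model, with a (non-public) system prompt shaping the output.

def transcribe(audio: bytes) -> str:
    """Stub for speech-to-text over bodycam audio."""
    return "Officer: Step out of the vehicle. Driver: OK, OK."

def draft_report(transcript: str) -> str:
    """Stub for the LLM call that turns a transcript into a narrative."""
    return "DRAFT - OFFICER REVIEW REQUIRED\n" + transcript

def generate_draft(audio: bytes) -> str:
    # Audio -> transcript -> LLM draft; the officer is meant to edit,
    # flesh out, and sign off on the result before filing.
    return draft_report(transcribe(audio))

print(generate_draft(b"\x00\x01"))
```

The point of the sketch is the shape of the pipeline, not the details: every stage between the camera and the filed report is a place where transcription or generation errors can creep in.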

Given the importance of police reports to investigations and prosecutions, and the unreliability already seen in other forms of law enforcement AI, the ACLU has little faith that Draft One can be deployed without creating civil rights and civil liberties problems. 

"Police reports play a crucial role in our justice system," ACLU speech, privacy and technology senior policy analyst and report author Jay Stanley wrote. "Concerns include the unreliability and biased nature of AI, evidentiary and memory issues when officers resort to this technology, and issues around transparency. 

"In the end, we do not think police departments should use this technology," Stanley concluded. 

It's worth pointing out that Axon doesn't have the best reputation when it comes to thinking critically about innovations: Most of the company's ethics board resigned in 2022 when Axon announced plans to equip remote-control drones with tasers. Axon later paused the program following public blow-back. 

Draft One, however, has already been in the hands of US law enforcement agencies since it was launched in April. It's not clear how many agencies are using Draft One, and Axon didn't respond to questions for this story.

We can't even trust AI to write accurate news

This vulture can personally attest to the misery that is writing police reports. 

In my time as a Military Policeman in the US Army, I spent plenty of time on shifts writing boring, formulaic, and necessarily granular reports on incidents, and it was easily the worst part of my job. I can definitely sympathize with police in the civilian world, who deal with far worse - and more frequent - crimes than I had to address on small bases in South Korea. 

That said, I've also had a chance to play with modern AI and report on many of its shortcomings, and the ACLU definitely seems to be onto something in Stanley's report. After all, if we can't even trust AI to write something as legally low-stakes as news or a bug report, how can we trust it to do decent police work?

That's one of the ACLU's prime concerns, especially given report drafts are being compiled from body camera recordings that are often low-quality and hard to hear clearly. 

"LLMs, while amazingly advanced at imitating human writing, are prone to unpredictable errors [that] may be compounded by transcription errors, including those resulting from garbled or otherwise unclear audio in a body camera video," Stanley noted. 

In an ideal world, Stanley added, police would be carefully reviewing AI-generated drafts, but that very well may not be the case. The report notes that Draft One includes a feature that can intentionally insert silly sentences into AI-produced drafts as a test to ensure officers are thoroughly reviewing and revising the drafts. However, Axon's CEO mentioned in a video about Draft One that most agencies are choosing not to enable this feature.
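Axon hasn't detailed how that review check works, but the idea - plant an obviously absurd "canary" sentence in the draft and reject the report if it survives editing - is simple to sketch. The function names and canary wording below are invented for illustration, not Axon's implementation:

```python
# Hypothetical sketch of a canary-sentence review check: insert one
# absurd sentence into the AI draft; if it is still present when the
# report is submitted, the officer clearly didn't read the draft.
import random

CANARIES = [
    "The suspect then transformed into a large purple dinosaur.",
    "At this point the officer briefly levitated.",
]

def insert_canary(draft: str, rng: random.Random) -> tuple[str, str]:
    """Plant one silly sentence at a random line break in the draft."""
    canary = rng.choice(CANARIES)
    lines = draft.split("\n")
    lines.insert(rng.randrange(len(lines) + 1), canary)
    return "\n".join(lines), canary

def review_passed(submitted: str, canary: str) -> bool:
    """The check passes only if the canary was edited out."""
    return canary not in submitted

draft = "Arrived on scene at 22:10.\nSpoke with the complainant."
planted, canary = insert_canary(draft, random.Random(0))
assert not review_passed(planted, canary)  # unedited draft fails
assert review_passed(draft, canary)        # canary removed: passes
```

Note that the check only works if it is switched on, which is exactly the ACLU's complaint: per Axon's CEO, most agencies choose not to enable it.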

The ACLU also points out privacy issues with using a large language model to process body camera footage: That's sensitive police data, so who exactly is going to be handling it? 

According to Axon's website, all Draft One data, including camera transcripts and draft reports, are "securely stored and managed within the Axon Network," but there's no indication of what that network entails. Despite Microsoft's insistence that police aren't allowed to use Azure AI for face recognition, that apparently doesn't apply to letting an AI write police reports, as Axon indicated in an April press release that Draft One "was built on top of Microsoft's Azure OpenAI Service platform."

Not exactly confidence-inspiring, given Microsoft's and Azure's recent security track record.

"When a user (such as Axon) uploads a document or enters a prompt, both of those are transmitted to the LLM's operator (such as OpenAI), and what that operator does with that information is not subject to any legal privacy protections," the ACLU report states. 

"Axon claims here that 'no customer [ie, police] data is going to OpenAI,' but normally in order to have an LLM like ChatGPT analyze a block of text such as the transcript of a bodycam video, you normally send that text to the company running the LLM, like OpenAI, so I'm not sure how that would work in the case of Draft One," Stanley told The Register in an emailed statement. We've asked Axon where data is processed and stored, but again, we haven't heard back. If OpenAI isn't getting access, Microsoft may be, at the very least.

The ACLU is also concerned that using AI to write police reports lacks transparency, especially if the modified version of ChatGPT underlying Draft One is steered by system prompts instructing it to behave in certain ways, as most LLMs are. 

"That's an example of the kind of element of an AI tool that ought to be public," the ACLU report argued. "If it's not, a police AI system could well contain an instruction such as, 'Make sure that the narrative is told in a way that doesn't portray the officer as violating the Constitution.'"  

We've asked Axon for a look at Draft One's system prompts. 

AI reports could lead to more police dishonesty

"This elasticity of human memory is why we believe it's vital that officers give their statement about what took place in an incident before they are allowed to see any video or other evidence," the ACLU stated in the report. Draft One bypasses that issue by generating a draft report primarily based on audio captured by body cameras, which officers ideally should not rely on exclusively to provide their own testimony.

If an officer reviewing an AI-generated report notices, for example, that something illegal they did wasn't captured by their camera, they need never mention it in their report. Conversely, if an officer lacked probable cause to detain or arrest a suspect, but their camera picked up background audio that would justify the action, that post-hoc probable cause could likewise disguise police misconduct. 

"The body camera video and the police officer's memory are two separate pieces of evidence," Stanley wrote. "But if the police report is just an AI rehash of the body camera video, then you no longer have two separate pieces of evidence - you have one, plus a derivative summary of it."

Along with potentially helping police cover up misconduct or construct after-the-fact justifications for illegal actions, the ACLU pointed to another issue identified by American University law professor Andrew Guthrie Ferguson: AI-drafted reports make officers less accountable for their actions. 

In a paper written earlier this year covering many of the same concerns raised by the ACLU, and cited as inspiration for its report, Ferguson pointed out that making police officers write reports can serve as a disciplinary check on their use of power.

Police have to justify the use of discretionary power in reports, which Ferguson and the ACLU pointed out serves as a way to remind them of the legal limits of their authority. 

"A shift to AI-drafted police reports would sweep away these important internal roles that reports play within police departments and within the minds of officers," the ACLU wrote. "This is an additional reason to be skeptical of this technology." 

At the end of the day, some police are using this technology now, though Stanley believes its use is likely confined to a few agencies around the US. Nor is Axon the only vendor: Policereports.ai and Truleo offer similar services. 

The ACLU told us it's not aware of any case in which an AI-drafted police report has been used to prosecute a defendant, so these reports have yet to face the legal scrutiny of a courtroom. ®
