theptip · a month ago
> So we don’t store the original draft and that’s by design and that’s really because the last thing we want to do is create more disclosure headaches for our customers and our attorney’s offices

You have to wonder if this will stand up in court. I hope not.

AI has a great opportunity to take processes that contain hidden bias and make them more legible and therefore amenable to fixing.

But it also has the opportunity to do the opposite, and we should be cautious to make sure guardrails are in place when putting this tech into life-and-death systems.

“Stamp this LLM text in a hurry” is an invitation for whatever errors and biases are baked into the system to be propagated. You need provenance and measurement of LLM outputs.

brookst · a month ago
Yeah the avoidance of record keeping to reduce disclosure smacks of the policies that got Google into hot water recently: https://www.epspros.com/news-resources/news/2024/google-accu...
moron4hire · a month ago
> sign an acknowledgement that the report was generated using Draft One and that they have reviewed the report and made necessary edits to ensure it is consistent with the officer’s recollection.

We already know that police officers are no more reliable than the general public as eyewitnesses, and that eyewitness reports are generally very unreliable because they are very susceptible to prompting bias. This seems like leaning into prompting bias. The AI is now prompting the human rather than the other way around. This is perverse.

UncleEntity · a month ago
No doubt.

I was watching one of those youtube bodycam videos of an accident scene where one of the cars ended up in a gas station. Police show up and it's chaos -- victims on the ground needing medical attention, witnesses helping (or not) said accident victims, police not knowing who was in what car, &etc.

In the midst of all this (when it calmed down enough for the police to get a handle on the scene) they tried to identify someone who didn't want to be involved and promptly cuffed them and threw them in the back of a squad car for "being uncooperative". One of the other witnesses, having seen this, decided that person was the missing driver of the other car and told this to the police with all sorts of confidence.

Now the police have a 'suspect' to concentrate on because anyone 'acting squirrelly' must have something to hide as it's totally inconceivable to them someone might just not want to participate in their investigation. Luckily this poor, traumatized kid was able to 'prove' they weren't involved before spending who knows how much time behind bars based on 'credible' eye-witness testimony.

These audio-only AI generated reports should be all kinds of accurate now that police are trained to say 'quit resisting' anytime there's any level of force involved specifically for the body cams...

Workaccount2 · a month ago
I wonder how much this even matters in the age of everything being recorded.

If they are using axon body cameras and vehicle cameras, then usually the entire interaction is recorded, often from multiple officers.

I cannot imagine a defense so incompetent that they rely on the police report rather than watching the entire body cam footage and doing their own assessment.

Even if the cops are doing something sketchy (like turning off their camera) then it's not like the police report would be any more trustworthy.

notaustinpowers · a month ago
The current administration has already removed the requirement for federal police forces to wear body cameras. As well as made statements (but little action so far) to federalize the police force to be under the jurisdiction of the DOJ. Everything being recorded may not be the case very soon. Sorry, I’d get sources but I just woke up, I’ll edit this later with them.
hughesjj · a month ago
I mean, already at the local cop level, "forgetting" to turn the body cam on, or only releasing the video (at least quickly) when it puts the officer in a positive light, seems to be the norm:

* https://www.pbs.org/newshour/nation/officers-body-camera-wen...

* https://www.nbcmiami.com/investigations/body-cameras-turned-...

* UK but it's the same discussion https://www.bbc.com/news/uk-66809642

* https://www.wbrc.com/2025/07/12/coroner-completes-report-jab...

* https://ktla.com/news/nationworld/release-of-police-bodycam-...

jameshart · a month ago
If it’s not being recorded, what would this AI summary be based on?
avs733 · a month ago
It goes a lot deeper than this, the real world isn't as simple as 'objective truth' and much of the law relies on interpreting the facts we all seek. This is where this technology fails, it normalizes nudging the margins to include a framing of what happened (including that video) using particular and precise language. That language influences court decisions.

For example, the phrase 'furtive movements' seems really anachronistic. Is that a phrase you use? That cops use in their day-to-day lives? But it constantly shows up in police reports. Why? The courts have said that 'furtive' movements are suspicious enough to trigger probable cause - which justifies a search. So now cops everywhere write that they observed movements that were furtive. Is what your attorney viewed furtive? Were they normal movements? Were they suspicious? The cop described them as furtive, though, and we defer to cops, in part because they speak the language of the courts. Now your arrest is valid, that search is valid, and whatever is recovered is valid - because a court said movements need to be furtive, and you sneezed, and a cop described that as furtive even though he had already decided to do the search before he got out of his car.

The only way our system works is if at every level every participant (people, jurors, judges, politicians) distrust the words of police - especially when they habitually use the language of the law to justify their actions. What this tool does is quite the opposite, it will statistically normalize the words police use to describe every interaction in language that is meant to persuade and influence courts now and over time to defer to police.

https://www.bjjohnsonlaw.com/furtive-movements-and-fourth-am...

https://www.californialawreview.org/print/whack-a-mole-sus

axus · a month ago
I was thinking the same thing. If the AI report depends on the raw audio, then it should be preserved and the defense should compare that to the final police report. Having the edit history would be useful for improving the software and analyzing the officer's motivations, but ultimately we're not in a worse situation than before.

I'd predict the synthesis of the AI transcript and the police officer's memory will be more accurate than just the police officer alone. Would be nice if there's an independent study.

There are very incompetent public defenders; if we attribute that to incompetence instead of malice, AI isn't changing it.

advisedwang · a month ago
Body cameras capture a lot less than you'd think. Let's take just one example: you can't see the cop at all.

Imagine a cop says "stand over there" but a suspect doesn't know where they mean and doesn't move. Axon might well add "I pointed to the kerb" on the police report, which the police officer rubber stamps. Now the system has invented a clear non-complying suspect with nothing but their word to refute it.

Similarly shaky camera, limited field of view, poor audio, periods when cameras are off all add opportunities for camera footage to fail to prove what happened. Police reports have a lot of weight in court, they are treated as true by default.

anigbrowl · a month ago
> I cannot imagine a defense so incompetent that they rely on the police report rather than watching the entire body cam footage and doing their own assessment.

Not incompetent at all. Police officers often turn in multiple reports following an incident and arrest. Sometimes they contradict each other, and it would be a foolish defense attorney who did not explore those contradictions.

qingcharles · a month ago
Even in jurisdictions that require recordings at all times, there are times when the police are required by law to switch them off (entering certain non-public spaces etc), so there can always be gaps that are legal, never mind illegal.
mrgoldenbrown · a month ago
This article is about Axon purposely working to make audits harder and police less accountable. Is it so hard to imagine a future feature that summarizes the body cam footage then erases it?
mycall · a month ago
It should matter which parts were written by the AI and which by the officer. Once the officer signs off on the report, they take full responsibility for the content.
conartist6 · a month ago
I can only assume you meant to write "shouldn't" instead of "should", but if you study human factors you'll discover that certain kinds of taking-shortcuts behavior are inevitable when dealing with humans. Speeding when we drive, for example. We know we are creating a material risk of getting pulled over and fined, but we just basically decide to ignore that risk because for most of us it is outweighed by the convenience (and real value) of getting everywhere we're going faster.

As always, considering how a person would interact with an intern is surprisingly instructive as to how they will form a working relationship with a non-sentient tool like a language model. You would expect them to give it a probationary period to earn their trust, after which, if they are satisfied, they will almost certainly express that trust by giving the tool a greater and greater degree of freedom with less active (and less critical) oversight.

It is not the initial state that worries me, where the officers still mistrust a new technology and are vigilant of it. What worries me is the late stage, where they have learned to trust it (because it has learned to cover their asses correctly) and the AI itself actually ends up exercising power in human social structures, because people have a surprising bias towards keeping their heads down and going with the flow even when it would be safer to speak up, and the flow is letting AI take operational control of society inch by inch.

mycall · a month ago
Yeah, shouldn't :)

Taking your point about speeding: it is government policy for public transit bus drivers to go up to 15 MPH over the limit to keep up with traffic, because it is safer, even though it is technically the government breaking the law.

You have the same worries as I regarding AI taking operational control which is likely inevitable at this point.

827a · a month ago
Yeah; law enforcement and the judiciary are going to be an early flashpoint in the "a computer must never make a management decision" conflict. IMO: it's actually really important that these systems do not explicitly mark content as AI-generated versus not, because if the operator is going to be held responsible for the final content, we can't allow repudiation of that content through an argument of "well, that part was AI generated". Even if it isn't written in their voice, it should be reviewed and accepted by them, and at that point it doesn't matter if the AI writes it.

This is a contrived metaphor, but imagine some case report makes its way to a judge, and it's missing some significant details about the case. But it's hand-written, and the officer argues that hand-writing is physically harder than typing, so of course hand-written reports won't be as comprehensive as typed ones. That argument is insane, partly because it's an imperfect metaphor, but the line of logic is there.

advisedwang · a month ago
There is no accountability behind that responsibility.

Cops are not held to account for lying now. Even when they are caught, 99% of the time the worst consequence for them is their testimony is ignored in court. They don't face professional repercussions in practice.

So even if officers are responsible for their reports they will still take the easy route and sign off on AI garbage. There is no downside, and it helps them meet pressure from their bosses and avoid the part of the job they hate the most.

zdw · a month ago
Do you read EULAs all the way through every time?

People just LGTM rubber stamp nearly everything they're given, as it's time efficient in the now.

tqi · a month ago
Do you think there is a difference between a civilian driver ignoring the routine maintenance schedule for their car and a professional pilot ignoring the maintenance schedule for their plane?
9dev · a month ago
That’s absolutely their choice, then. But if it turns out the AI wrote bullshit into the report, the officer that rubber stamped it must be held accountable for that, with no difference to a situation where they had written the bullshit themselves.
asah · a month ago
The real issue is accountability - officers need to be held accountable for reports the way pilots are accountable for use of auto-pilot[1].

[1] yes they are: https://www.google.com/search?q=are+pilots+accountable+for+u...

hollywood_court · a month ago
Law enforcement needs greater accountability altogether.

I’ve long believed that police officers should be required to carry private liability insurance, just like professionals in many other high risk fields. If an officer is uninsurable, they should be unhireable, plain and simple. Repeated misconduct would drive up their premiums or disqualify them entirely, creating a real consequence for bad behavior.

It’s astonishing that police officers aren’t held to the same standards as the rest of us. As a carpenter and building contractor, if I showed up at the wrong address and built or tore down something by mistake, I’d be financially and legally responsible. I’d be expected to make it right, and my insurance would likely step in.

But when a police officer raids the wrong home, injures or kills innocent people, or throws tear gas into a room with a baby, there’s rarely accountability—legal, financial, or professional. That’s unacceptable in any system that claims to serve and protect the public.

pjc50 · a month ago
> But when a police officer raids the wrong home, injures or kills innocent people, or throws tear gas into a room with a baby, there’s rarely accountability—legal, financial, or professional. That’s unacceptable in any system that claims to serve and protect the public.

The American public, or at least the set of them whose vote counts among the gerrymandering, have explicitly chosen this. Their representatives are now building an even less accountable system to be used against "immigrants", i.e. anyone non-white, who can be abducted and denied legal representation.

FireBeyond · a month ago
> But when a police officer raids the wrong home, injures or kills innocent people, or throws tear gas into a room with a baby, there’s rarely accountability—legal, financial, or professional.

It's not just that there's rarely accountability - there's explicitly no accountability.

People have sued officers, police departments, and cities for the cost of damages from such mistaken raids (including ones that were completely negligent, like the wrong street entirely), and the courts have explicitly ruled that they have zero responsibility to pay for any of the damage caused.

Spooky23 · a month ago
That’s a dangerous slippery slope. Most public officers (employees) are subject to a wide range of ethics and other regulations that impact post-service employment. In exchange, you’re indemnified for official acts and the government has a duty to defend you.

I’ve served in policy-making roles at different levels of government. There are a variety of businesses I’m not permitted to enter post-employment, some for 2-5 years, some indefinitely. Those restrictions are taken seriously, and I know that I’ll be held accountable.

Putting the onus on the employee is really enabling bad behavior - the issue is the poor governance of the police, and using the courts as some sort of cudgel won’t fix it, it will just create more corruption as the powers that be will hang out patsies to take the fall.

If the police are allowed to operate paramilitary forces, they need paramilitary discipline and rules of engagement. Army soldiers breaking rules of engagement get punished and officers sidelined and pushed out of the service. Police in many cases have been allowed to create cultures where everyone scratches each others back. Many police are veterans, and many privately will comment on the differences between those experiences.

IMO, the way to address the issues you describe is standard separation of duties. Invest in state and regional police forces, disempower local police, and move enforcement and investigation of police to a chain of command removed from the police. (Perhaps a State AG) When you need to blunt the variance associated with people’s poor application of discretion, the answer is usually a bureaucratic process.

tbrownaw · a month ago
> I’ve long believed that police officers should be required to carry private liability insurance, just like professionals in many other high risk fields. If an officer is uninsurable, they should be unhireable, plain and simple. Repeated misconduct would drive up their premiums or disqualify them entirely, creating a real consequence for bad behavior.

And it'd be administered by some faceless bureaucracy full of accountants, rather than a couple local politicians that the union can just bully (or bribe or whatever) into ignoring things.

But of course the current mess derives from sovereign immunity, which might be a bit tricky to get the politicians to tinker with more than they already have. :(

potato3732842 · a month ago
>Law enforcement needs greater accountability altogether.

Which is directly contrary to what's good for the state. So unless the lack of accountability becomes more threatening to the state than it is useful to it, the lack of accountability will remain. At best it might get slightly better over decades.

moron4hire · a month ago
Politically, you could probably sell the insurance idea as actually protecting officers. But then you'd get the wrong people opposing it...
barbazoo · a month ago
Chesterton’s fence comes to mind. I wonder what unintended positive effects the current policy has.
hxtk · a month ago
I really wish policing would take more inspiration from aviation on a different avenue for police accountability.

The NTSB exists not to blame pilots (though it sometimes does), but to make air travel safer and prevent future plane crashes. In the business of preventing disaster in a safety-critical industry, if you chalk something up to human error or call it a tragic accident, you guarantee that it will happen again. Finding that everyone did everything by the book means the book needs to be rewritten, because the book that exists today contains a recipe for plane crashes.

I wish police would treat use of force incidents the same way. The investigations after police use of force ask whether the officer violated the law or department policy. Like most law enforcement and judicial work, the exercise focuses on identifying, trying, and punishing guilty parties. If there is no guilty party, the process can produce no change. I would like to see more investigations into police use of force that focus on improving safety outcomes instead.

KingMob · a month ago
> I wish police would treat use of force incidents the same way

Cops have the system they want. They won't voluntarily change it unless forced.

pbronez · a month ago
Yes. It doesn’t matter exactly how each word in the police report was entered. All that matters is the officer signed off on it. They should be personally & totally responsible for the contents of the report. I don’t care if they use generative AI, speech to text, Dvorak touch typing, QWERTY hunt-and-peck or anything else. An officer must read the final report and sign to assert its accuracy.

If police reports are low quality, it’s an officer performance problem. Obviously performance management in public safety is exceptionally challenging, but that’s the problem domain that matters. You cannot solve law enforcement accountability by tweaking your AI User Interface.

That said, this seems like a missed opportunity to use technology to increase accountability. If you’re running speech to text on body cam footage, great! Everyone involved in the conversation should get a copy of the transcript. There should be a straightforward way to challenge STT errors.

Again though, it’s the same deal as the body cam footage itself. Always-on body cams with default public access are one thing, officer-managed, sue-to-review is quite another. The crucial issues are political, not technical.

adriand · a month ago
That’s really just one issue among many, and it actually makes me worry more about this technology, not less: it provides a clear incentive for the officer to stand by the contents of a report that he or she did not write, even if they realize at some point it is wrong, because they hastily or lazily signed it.

The way this technology is designed is a clear example of dystopian outcomes driven by market forces: capitalism inserted into processes (like justice) which society ought to protect against perversion by profit motives. I can imagine a version of this technology that is designed with societal benefits in mind, but instead we get one designed to make the sale.

qingcharles · a month ago
Here's the thing. I've read thousands of police reports. Most police reports are super short and super vague. Most police reports are never read even once, even though a large percentage result in convictions. Most criminal charges result in plea deals. Most defendants will never see any evidence against them before pleading guilty†.

If, in the exceptionally rare case that a defendant goes to trial, an officer has to testify, it is probably on average a year after he wrote the report. He will be sat down just before trial by the prosecutor and shown his report and asked to read it. On the stand he generally will not have his report available to reference and is supposed to use his memory, but this will be corrupted by his reading of whatever is in the report he read a couple of hours before. If his report is full of inaccuracies he will almost certainly testify under oath to those.

†This situation has changed very slightly in the last few years with lawyers now supposed to verify the probable guilt of their client before recommending a guilty plea.

tehsolution · a month ago
The solution to a police state is more policing?

What about less? Take away guns and reach of the cops and politicians?

Accountability by making 900k cops across all levels of government stripped of power and made normal people? Same for the 600k politicians coast to coast. Screw their story mode mental illness.

Make everyone busy generalizing logistics process to serve biology and stop with story mode hustling memes about fiat (vacuous proclamations) valuations using jargon from the 1800s?

Roughly 1.5 million pols and cops have 10s of millions wrapped around their finger. With urbanization the best part is a bunch of them live just a few miles from any given large urban area full of people being screwed by them.

The time for demanding meager reforms from 60+ year olds who have no skin in our future is long gone.

Skip the guns and go the route of making everyone a normie civil servant and no one has leverage https://aeon.co/essays/game-theory-s-cure-for-corruption-mak...

Except the low-level gossipy kind, like "so-and-so cheated". Statistical analysis of death trends suggests we kill each other on Main Street over such gossip at the same rate humans did centuries ago. It's those moments of nation-state-fueled atrocity and imperialism when human death spikes. It seems clear that in the streets most adults just don't go on murderous rampages.

ta8645 · a month ago
That's incredibly naive. Spend some time watching police body camera footage. By and large, the police are doing exceptionally well in hostile and difficult circumstances. We're all safer because there is a real counterforce to tough guys, mafias, and paramilitary strong men. They don't exist or are heavily controlled, because the police are a powerful force for good. Taking away the police would create a power vacuum that would be filled faster than you can imagine; and by people who will treat us all much worse than the police ever have.
patrickhogan1 · a month ago
This requires audio to work and appears to create more transparency. You can request the audio recording to verify accuracy. This will happen as routine procedure for defense attorneys. Any problems with the technology would be discovered quickly, and if the officer didn't do their job of correcting the errors before signing off on the report, they would be torn apart.
advisedwang · a month ago
You can verify the accuracy of a transcript, but a police report is much much more than a transcript. It includes off-camera action, smells, states of mind, impressions, context, facial expressions, etc. If Axon hallucinates that stuff, there's no way you can use video footage to check the accuracy or refute the statement.
troupo · a month ago
When the EU introduced its AI Act, there was much gnashing of teeth here on HN over "stifling of innovation" and "getting left behind in technological backwater".

The EU AI Act specifically calls out and forbids such applications. Of course, the state will do what the state will do, but there's an actual obstacle enshrined in law.

rdm_blackhole · a month ago
And there is the right to privacy enshrined as well but that has no bearing on what states will do ultimately.

The EU is in the midst of ending encryption and will soon require lawful access to all your data by forcing providers to bake in legal backdoors in OSes so that nobody can bypass/deactivate them.

All of this done under the guise of protecting the children, stopping misinformation(the ministry of truth is back) and protect democracy (TM).

The AI act may be a good thing in some cases but we should all stop pretending that the EU is not following in the footsteps of the US when it comes to loss of privacy and restriction of freedom of speech.

Many western countries are slowly sliding into wannabe authoritarian regimes.

squirrel · a month ago
Creative lawyers will be all over this. First you get the officer to testify that AI helped write the report, then you call the AI as a witness. When the judge tosses that, you start issuing subpoenas to everyone you can find at OpenAI and Axon.

As others point out, the actual bodycam footage will be definitively probative for the events it records. But there are plenty of cases where the report itself leads to later actions that may be tortious or criminal, and finding out who's to blame for the exact wording used is highly relevant.

Example: AI incorrectly reports that during A's arrest, A made incriminating allegations about B. Based on the report, the police get a warrant and search B's house. When it turns out B is innocent, B sues the department, and when the report turns up during discovery, we're off to the circus.