This article was published on June 8, 2020

Here are 5 ways to use AI as a ‘bad apple detector’ for cops


Image by: ArtOlympic/Shutterstock

When an apple begins to rot it creates a chemical called ethylene. If that apple happens to be in a barrel with a bunch of other apples, and the rotting causes its skin to break, the ethylene will immediately cause the other apples to start rotting. That’s why the proverb “one bad apple spoils the bunch” is meant as a warning. If you find one bad apple, all the apples around it are already rotting.

Obviously, the smartest thing to do is to locate, isolate, and remove bad apples before they can poison others. That’s pretty easy to do when the apples are literal, but what about when they’re a metaphor for systemic racism run rampant in the justice system?

The simple fact of the matter is that nothing less than top-down systemic upheaval at the grandest scale can solve the issues plaguing US law enforcement. While it might be a polarizing idea, abolishing the police may very well be the most logical resolution.

However, that kind of paradigm shift will take years. We’ll need interventions in the meantime. Whether you’re a die-hard police supporter or someone who thinks we don’t need paramilitary troops with badges patrolling US streets, we can all agree that the “bad apples” have to go.

Artificial intelligence provides several interventions we could deploy immediately. Here are five quick, cheap, and easy solutions we could implement:

Chat bots

No, not the chat bots you see on social media. We’re talking about AI chat bots that use cognitive behavioral therapy to help people develop positive habits and routines surrounding their mental health. Like, for example, Woebot.

This form of therapy isn’t a replacement for on-the-job mental evaluations and personal therapy sessions, but instead would be used as a way to monitor officers’ mental states. It could be developed as a smartphone app and would only require that officers spend five minutes at the beginning and end of every shift.

Coupled with relevant advice and mental health education, this simple intervention could help officers cope with the unique struggles and stress of their profession while also alerting department heads if an officer seems unbalanced or in need of human intervention.
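The alerting half of that idea can be sketched in a few lines. This is a minimal illustration only: the questions, the 1–5 scale, and the two-low-answers threshold are all hypothetical stand-ins, and a real tool like Woebot is built on CBT techniques with clinical input, not a hard-coded rule.

```python
# Hypothetical shift check-in: questions, scale, and threshold are
# illustrative assumptions, not how any real CBT chat bot works.
ALERT_THRESHOLD = 2  # two or more low answers triggers a human follow-up

QUESTIONS = [
    "On a scale of 1-5, how rested do you feel?",
    "On a scale of 1-5, how in control of your stress do you feel?",
    "On a scale of 1-5, how supported by your team do you feel?",
]

def evaluate_checkin(answers):
    """Return True if the responses suggest a human should follow up."""
    low_scores = sum(1 for a in answers if a <= 2)
    return low_scores >= ALERT_THRESHOLD

print(evaluate_checkin([2, 1, 4]))  # True: flag for follow-up
print(evaluate_checkin([4, 5, 3]))  # False: no alert
```

The point is the routing, not the questionnaire: the bot handles the daily five minutes, and only the flagged cases reach a department head or therapist.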

Natural language processing

If we really want to root out bad apples, we’re going to need evidence of their crimes. Unfortunately, all we currently have are eyewitness testimony and body cams (when they aren’t malfunctioning or being intentionally turned off). One solution would be ubiquitous audio surveillance of cops. We could use natural language processing to record and transcribe everything a cop says or hears the entire time they’re on duty, then cryptographically secure the transcripts so they couldn’t be altered or deleted without detection and would be admissible in court.
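The "can't be deleted without detection" property is usually built as a hash chain: each transcript entry's hash covers the previous entry's hash, so tampering with any record invalidates everything after it. Here is a minimal sketch using only Python's standard library; the record fields and verification policy are assumptions for illustration.

```python
import hashlib
import json

def append_entry(log, text):
    """Append a transcript line; its hash chains to the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps({"text": text, "prev": prev_hash}, sort_keys=True)
    log.append({
        "text": text,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return log

def verify(log):
    """Recompute every hash; any edit or deletion breaks the chain."""
    prev_hash = "0" * 64
    for record in log:
        payload = json.dumps({"text": record["text"], "prev": prev_hash},
                             sort_keys=True)
        if record["prev"] != prev_hash or \
           record["hash"] != hashlib.sha256(payload.encode()).hexdigest():
            return False
        prev_hash = record["hash"]
    return True

log = []
append_entry(log, "10:02 traffic stop initiated")
append_entry(log, "10:05 subject detained")
print(verify(log))          # True: chain intact
log[0]["text"] = "edited"   # tampering with an early record...
print(verify(log))          # False: ...is detected
```

A production system would also anchor the chain somewhere outside the department's control (a court-supervised server, say), so the whole log can't simply be discarded.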

Officers are often accused of using racist epithets in the line of duty. This would help their superiors and the public to determine if such accusations are warranted in instances where police aren’t caught on camera.

AI background evaluations

Despite the police’s incessant use of illegal surveillance equipment without legal warrant, we the people don’t actually keep tabs on the cops. In fact, once they’re hired, we tend to let the system handle its own. That’s why so many cops who’ve been fired from one department find immediate work in another. We need an AI system that can crawl through personnel files to make historical and predictive inferences. In other words, we need three algorithms crawling through every officer’s personnel file at all times:

  1. A historical algorithm that ensures officers aren’t continuing to work in law enforcement after being fired for violent behavior or other violations.
  2. A real-time algorithm that processes police records for accuracy and relevance. This will ensure that all officer training is up to date and that files never go missing when allegations occur.
  3. A predictive algorithm that uses historical data from all police records to determine which active officers present a high risk of violent behavior.
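The third check can be as simple as an outlier rule to start with. The sketch below flags officers whose complaint count is well above the department average; the record format and the 2x-average threshold are assumptions for illustration, and a real system would use far richer features than a raw count.

```python
# Hypothetical predictive check: flag officers whose use-of-force
# complaint count exceeds a multiple of the department average.
def flag_high_risk(records, multiplier=2.0):
    """records maps officer ID -> number of use-of-force complaints."""
    if not records:
        return []
    average = sum(records.values()) / len(records)
    return sorted(oid for oid, n in records.items()
                  if n > multiplier * average)

complaints = {"A12": 1, "B07": 0, "C33": 9, "D41": 2}
print(flag_high_risk(complaints))  # ['C33']
```

A flag here wouldn't fire anyone by itself; it would route that officer's file to the kind of human review that currently only happens after something goes wrong.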

A social media monitor

More specifically, an FBI-backed system utilizing several AI models to monitor social media, the dark web, and officer counseling logs for white supremacist activity. The bulk of all reporting on law enforcement officers involved in racist organizations has come from journalists. And, despite an unwillingness from cops to police their own, the media has uncovered thousands in the past few years. A systematic evaluation conducted by a competent machine learning team could potentially identify even more bad apples and, with the FBI and Justice Department’s help, attach names and files to them so they can be immediately dismissed and banned from law enforcement.

Facial recognition

Finally, we need computer vision services and facial recognition software to identify every single police officer caught on video perpetrating violence against protesters during the past two weeks. If we’re to move forward into an era where US police are not synonymous with white supremacy, violence against peaceful citizens, and lynching, they must be held accountable for their actions. Ensuring all the officers who used looting and riots as an excuse to unleash barbaric tactics against peaceful protesters are brought to justice should be the first step towards restoring the public faith in the future of law enforcement.
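Mechanically, the matching step in such a pipeline compares face embeddings (numeric vectors a trained model extracts from each face) against a roster of known officers. The sketch below shows only that comparison step; the three-dimensional embeddings and the 0.8 similarity threshold are made-up stand-ins, since real embeddings have hundreds of dimensions and come from a trained model, not hand-typed lists.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_officer(query_embedding, roster, threshold=0.8):
    """Return the best-matching officer ID, or None below the threshold."""
    best_id, best_score = None, threshold
    for officer_id, embedding in roster.items():
        score = cosine_similarity(query_embedding, embedding)
        if score > best_score:
            best_id, best_score = officer_id, score
    return best_id

# Toy roster with hypothetical badge IDs and embeddings.
roster = {"badge_118": [0.9, 0.1, 0.3], "badge_207": [0.1, 0.8, 0.5]}
print(match_officer([0.88, 0.12, 0.31], roster))  # badge_118
```

The threshold is the whole ballgame in practice: set it too low and the system accuses the wrong people, which is exactly the failure mode that makes facial recognition controversial when pointed at civilians rather than officials.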

These suggestions are not a panacea for the problem, but prescriptions that could aid in the transition from violent police rule to a state of peaceful democratic law enforcement.
