Mozilla artists question whether AI could predict police killings


By Jonathan Greig | October 14, 2021 | Topic: Artificial Intelligence

Two Black artists have released a new project asking a provocative question: What if we could predict the next police shooting?

With the help and funding of Mozilla, the two artists have released an innovative project called “Future Wake.” The interactive website takes on police use of predictive software by doing the reverse: using AI trained on real law enforcement data to predict future police killings.

The piece, created by two artists who asked to remain anonymous due to the sensitivity of the subject, features stories about fictional future victims of police violence in Chicago, Houston, Los Angeles, New York and Phoenix.

The use of artificial intelligence by police departments in the United States has caused controversy for years. AI, facial recognition software and predictive tools are now widely used across the country, sparking discussions about the role technology plays in police violence and the inherent biases in certain platforms. At least 650 people have been shot to death by police in the US in 2021.

The site's AI system predicts the location and manner of hypothetical future police killings. These predictions are presented in brutal detail: viewers see and hear from a fictional victim discussing their experiences and eventual death.

The AI system, trained on the Fatal Encounters and Mapping Police Violence datasets, "predicts" who in those cities is most likely to be killed by police, where they could be killed, and how they could be killed.

The artists are quick to mention that Future Wake is an “art project that intends to stir discussions around predictive policing and police-related fatal encounters.” 

“The ‘Wakes’ are generated using statistics and Artificial Intelligence and are based on 20 years of historical data. The generated victims and stories are not real,” the artists say on the site.


The project was funded through Mozilla’s Creative Media Awards. The two artists from the Netherlands are part of a larger program that was announced in January called “Black Interrogations of AI.”

The program sought to help artists take on AI systems, saying the tools “can perpetuate and amplify biases that have long existed offline.”

“Recommendation algorithms promote racist messages. Facial recognition systems misidentify Black faces. And voice assistants like Alexa and Siri struggle to understand Black voices. As the AI in consumer technology grows more sophisticated and prevalent, problems like these will grow even more complex,” Mozilla said.

Tim, one of the artists behind the project who asked that his last name be withheld, told ZDNet that the concept for Future Wake was born out of their conversations around AI, predictive policing, technology and art. 

“Our intent is to present the artwork with a couple of layers of depth. The first layer contains visual storytelling about the future victims, accessible for most adults in and outside the US. The second layer is the concept of AI and reversed predictive policing for the more tech- and art-savvy. The third is a deeper dive into the technology and politics of these subjects,” Tim said.

Tim said he hoped the project would prompt people to be more empathetic toward those killed by police and more critical of statistics released by police departments. 

He also wanted people to be aware of the predictive software being used by police departments and for there to be more urgency about stopping future deaths at the hands of police. 

When asked about potential criticism or backlash — particularly from Black viewers who may find this piece distasteful or insensitive to the actual Black lives that are lost every day to police violence — Tim said that energy should be redirected elsewhere.

“As black artists, we’re aware of this sensitive subject. We’re presenting this artwork as carefully as we can. Still, this is an artwork; people can like it or not. If someone is offended by our depiction of possible future victims, we think it’s better to redirect this energy to help combat this violence,” Tim explained. “This piece is not only about black victims; all demographics are represented in the database and predictions.”

Tim also lauded Mozilla for supporting arts and tech while facilitating conversations in their events and outlets. 

“They have a very empathetic, critical and explorative attitude that supports/empowers artists like ourselves. And with the help of Harmony Labs, we’ve had many kinds of workshops and conversations with artists inside and outside our cohort,” Tim said. 

He added that while the piece currently exists only as a website, the artists are discussing the creation of an offline version.

“Future Wake turns the application of predictive policing upside down. Rather than predicting crimes committed by the public, it focuses on future fatal encounters with the police,” the two artists said. 

“Rather than communicating the traumas of police brutality solely through data and statistics, we intend to connect viewers to depictions of these predicted future civilian-police encounters through human-driven storytelling.” 

