Spain overhauls domestic violence system after criticism

17 Jan 25

The Spanish government this week announced a major overhaul of a program in which police rely on an algorithm to identify potential repeat victims of domestic violence, after officials faced questions about the system’s effectiveness.

The program, VioGén, requires police officers to ask the victim a series of questions. The answers are entered into a software program that produces a score — from no risk to extreme risk — intended to flag women who are most vulnerable to repeated abuse. The score helps determine what police protection and other services a woman can receive.

A New York Times investigation last year found that police were too dependent on the technology, almost always accepting the decisions made by the VioGén software. Some women whom the algorithm labeled as facing no or low risk of further harm were later abused again, including dozens who were killed, The Times found.

Spanish officials said the changes announced this week were part of a long-planned update to the system, which was introduced in 2007. They said the software had helped under-resourced police departments protect vulnerable women and reduce the number of repeated attacks.

In the updated system, VioGén 2, the software will no longer be able to label women as not at risk. Police also must enter more information about a victim, which officials said would lead to more accurate predictions.

Other changes aim to improve cooperation between government agencies involved in cases of violence against women, including facilitating information sharing. In some cases, victims will receive personalized protection plans.

“Machismo is knocking on our doors and it’s doing so with a violence unlike anything we’ve seen in a long time,” Ana Redondo, the equality minister, said at a press conference on Wednesday. “This is not the time to take a step back. It’s time to take a step forward.”

Spain’s use of an algorithm to guide the treatment of gender-based violence is a prominent example of how governments are turning to algorithms to make important societal decisions, a trend expected to grow with the use of artificial intelligence. The system has been studied as a potential model for governments elsewhere trying to combat violence against women.

VioGén was created with the belief that an algorithm based on a mathematical model could serve as an unbiased tool to help police find and protect women who might otherwise be lost. Yes or no questions include: Was a weapon used? Were there economic problems? Has the aggressor shown controlling behavior?
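The actual VioGén model, its full question set, and its weights have not been made public; as a rough illustration of how a questionnaire-based risk score of this kind can work, here is a minimal sketch in Python in which the question names, weights, and thresholds are all invented for the example.

```python
# Hypothetical sketch of a questionnaire-based risk classifier.
# The real VioGén questions, weights, and thresholds are not public;
# everything below is invented for illustration only.

# Score thresholds mapped to risk bands, lowest to highest.
RISK_BANDS = [
    (0, "no risk"),
    (1, "low"),
    (3, "medium"),
    (5, "high"),
    (8, "extreme"),
]

# Invented weights for a few yes/no items like those the article mentions.
WEIGHTS = {
    "weapon_used": 3,
    "economic_problems": 1,
    "controlling_behavior": 2,
}

def score_answers(answers: dict[str, bool]) -> int:
    """Sum the weights of every question answered 'yes'."""
    return sum(w for q, w in WEIGHTS.items() if answers.get(q))

def risk_band(score: int) -> str:
    """Return the highest band whose threshold the score meets."""
    band = RISK_BANDS[0][1]
    for threshold, label in RISK_BANDS:
        if score >= threshold:
            band = label
    return band
```

A scheme like this makes the reporting in the article concrete: a victim whose answers trip only low-weight items lands in a low band and, under the original system, received correspondingly little protection.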

Victims classified as higher risk received more protection, including regular patrols of their home, access to a shelter and police monitoring of their abusers’ movements. Those with lower scores received less help.

As of November, Spain had more than 100,000 active cases of women being assessed by VioGén, with around 85 percent of victims classified as facing a low risk of being hurt again by their abuser. Police officers in Spain are trained to overturn VioGén’s recommendations if the evidence warrants it, but The Times found that risk scores were accepted about 95 percent of the time.

Victoria Rosell, a judge in Spain and a former government delegate focused on gender-based violence issues, said a period of “self-criticism” was needed for the government to improve VioGén. She said the system could be more accurate if it pulled information from additional government databases, including the health care and education systems.

Natalia Morlas, president of Somos Más, a victims’ rights group, said she welcomed the changes, which she hoped would lead to better risk assessments by police.

“Good victim risk calibration is so important that it can save lives,” Ms. Morlas said. She added that it was essential to maintain close human oversight of the system because a victim “should be handled by people, not machines.”
