Eticas Report on VioGén
The first report published by Eticas in its exploration of independent, external audits of algorithms. The subject is VioGén, a domestic violence risk calculation tool used by law enforcement in Spain.
March 18, 2021
Overview
The subject of this audit is VioGén, a domestic violence risk calculation tool used by law enforcement in Spain.
The use of an algorithm for such risk assessment may be concerning. The justifications are typical: there are limited resources available to protect domestic violence victims from being attacked again, and those resources need to be allocated somehow. An algorithm provides a method for doing so, and the hope is that it will be more effective than the alternatives, and more objective (or, at least, less haphazard).
The VioGén system is based on answers provided by the victims in a questionnaire. The questionnaire was created (and occasionally updated) by experts in this type of violence and psychology. A risk score is provided as a recommendation, and can be adjusted by the police, but only in one direction (to indicate higher risk), and is reviewed by a judge.
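The upward-only override described above can be sketched in code. This is an illustrative sketch, not VioGén's actual logic: the risk-level names and their ordering below are assumptions for demonstration, since the system's full internals are not public.

```python
# Illustrative sketch only -- the real VioGén risk levels and override
# rules are assumptions here, not the published implementation.
RISK_LEVELS = ["unappreciated", "low", "medium", "high", "extreme"]

def adjust_score(recommended: str, police_assessment: str) -> str:
    """Police may override the algorithm's recommendation, but only upward:
    the final level is never lower than what the algorithm recommended."""
    rec = RISK_LEVELS.index(recommended)
    pol = RISK_LEVELS.index(police_assessment)
    return RISK_LEVELS[max(rec, pol)]
```

Under this sketch, `adjust_score("low", "high")` yields `"high"`, while `adjust_score("high", "low")` still yields `"high"`: the recommendation acts as a floor, which matches the one-directional adjustment the report describes.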
The Eticas report calls out many issues and biases with VioGén: that victims must answer these critical questions right after their violent incident, in a state of shock and trauma. That they often lack a clear understanding of how the answers will be used, or the VioGén score and process, and they do so without legal representation. That the approach is limited by the low report rate (22%) for domestic violence, and further limited because 73% of those killed by partners did not previously report an aggressor. That there may be problems with the phrasing and structure of the questions. That there is poor perception and a low level of trust in the VioGén system among previous victims and their lawyers – an important signal. For details and more, please see the report.
Attempting to use algorithms to assess the risk of a repeat offense has precedent, particularly for bail and recidivism, and there is plenty written on the topic. The discussion centers around fairness and racial outcomes, strategies for assessing the performance or accuracy of the algorithm, and how algorithmic approaches compare with human judgement. These are important conversations as the use of such systems increases worldwide.
comments
- What does a low score mean anyway?
- 95% do not change the score
- What are the alternatives, actually? Procedural? Human-driven?
- combination?
Quick Reference
About the external audit initiative
- Initiative by Eticas Foundation.
- This initiative is a practical exploration in algorithmic accountability audits. Eticas previously published an Accountability Audit Guide, offering a methodology for companies, public organizations, and citizens to audit algorithms.
- “External audit” here means essentially a “black box” approach to an audit, relevant when direct access to the algorithm, the underlying code, and the team creating it, are limited or unavailable.
- Goals are to learn through practice, and to publish learnings on such an approach to algorithmic accountability
- The program targets algorithms in several different domains. Based primarily in Spain.
Note and disclaimer: I’m a member of the scientific advisory board for this initiative. This means I get updates about once every two months and occasionally offer comments.
About VioGén
- Gender violence risk assessment tool. Home page at interior.gob.es.
- Used all over Spain since 2007, the system has performed 3M+ risk evaluations.
- There are existing studies on this tool, see in particular Lopez-Ossorio 2019 and 2020.
- There is a proposal to change the VioGén algorithm from an actuarial weighted-sum-of-factors to machine-learning, which has raised questions.
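An actuarial weighted-sum-of-factors model of the kind mentioned above can be sketched as follows. The factor names, weights, and thresholds here are invented for illustration; they are not VioGén's actual parameters, which are not publicly documented.

```python
# Illustrative sketch of an actuarial weighted-sum risk model.
# Factor names, weights, and thresholds are invented, not VioGén's.
FACTOR_WEIGHTS = {
    "prior_violence": 3,
    "threats": 2,
    "aggressor_substance_abuse": 1,
}

# Cutoffs checked from highest to lowest; score >= cutoff selects the band.
THRESHOLDS = [(5, "high"), (3, "medium"), (0, "low")]

def actuarial_score(answers: dict) -> str:
    """Sum the weights of the factors answered affirmatively, then map
    the total onto a discrete risk band."""
    total = sum(w for factor, w in FACTOR_WEIGHTS.items() if answers.get(factor))
    for cutoff, band in THRESHOLDS:
        if total >= cutoff:
            return band
    return "low"
```

The point of the sketch is the contrast drawn in the proposal: a model like this is fully inspectable (each answer's contribution to the score is a fixed, auditable weight), whereas a machine-learning replacement would generally not be, which is one reason the proposed change has raised questions.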
The Eticas audit and report on VioGén
- Read the report on the Eticas website.
- The audit used quantitative methods (statistical study of data that could be obtained independently) combined with qualitative methods (interviews, research)
What can we learn about external algorithmic audits from this?
Why are AIAs important?
What’s going on with AIAs currently?
The report offered some lessons learned about the audit approach and methodology:
- adjust expectations about the available data
- analyze the system from end-to-end
- use a multi-method approach
- seek alternative data to work around barriers (proprietary information, etc.)
- assess the gap between system design and the experience of those interacting or affected
AA comments
- Contrast internal and external audit mechanisms? What is it possible to achieve with each?
- Idealized vs. implemented external audits: define the limitations and opportunities of an external audit.
- Cost-benefit analysis: what resources are optimized here? What is the cost of not using the system?
- So many details to dig into; separate conversations?
- What are the alternatives?
This audit took 7 months. Can we make the process faster and more efficient for others in the future?