AI can detect fear and danger in the voices of abused women, according to a study

Can the voice of a victim of gender violence, and the sounds that surround her, be key to her protection? Can artificial intelligence be useful in detecting sexist violence? Spanish researcher Esther Rituerto has shown that it can, and she has been recognized for it.

The Government Delegation against Gender Violence in Spain recognized Rituerto’s work in the latest edition of its awards for doctoral theses on violence against women, for “deepening the use of audio technology and artificial intelligence to prevent and combat” sexist abuse.


Rituerto explains to EFE that the objective of her work was to use speech and the acoustic signals around a woman to detect when a situation may be dangerous for her.

Specifically, to detect fear in her voice in order to determine whether she is at risk of being attacked by her aggressor, as well as surrounding sounds that warn of danger: hurried footsteps in the early hours, blows, objects breaking…

In its assessment of the thesis, the Government Delegation against Gender Violence highlights that it uses artificial intelligence to understand women’s reactions to situations of risk or danger, with the aim of generating automatic mechanisms for detecting these situations based in particular on the auditory modality.

Furthermore, it notes that “the entire methodological process used is extremely interesting and rigorous, as well as respectful of the victims.”

The institution, which reports to the Ministry of Equality, considers that “the medium-term impact of this research on protection systems for victims of gender violence can be highly relevant.”


Detecting fear

Rituerto’s thesis grew out of UC3M4Safety, a project at the Carlos III University of Madrid that seeks to use technology to prevent, detect, protect against and combat gender violence.

A research team with profiles from different disciplines was formed to develop electronic devices capable of detecting fear in a victim of gender violence through her voice or her vital signs (temperature, pulse, sweating…).

When these reactions are detected, the smartphone sends an alert to the emergency services and the police, or to the victim’s close circle.

The researcher, an audiovisual systems engineer, points out that the devices (a pendant with a microphone to capture sounds, and a bracelet to detect biological markers) are linked to a smartphone, which in turn is connected to a cloud system.

Artificial intelligence helps determine when a situation is risky and when an automatic response must be sent to help the victim.
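The article describes this pipeline only in broad terms. Purely as an illustration, the sketch below shows one way such a decision step could look, assuming hypothetical fear and stress scores (0 to 1) produced by the audio and biosignal models and an assumed alert threshold; none of these names, weights or values come from the UC3M4Safety project itself.

```python
# Illustrative sketch only: the names, weights and threshold below are
# assumptions made for explanation, not the UC3M4Safety implementation.

from dataclasses import dataclass


@dataclass
class RiskScores:
    fear_from_voice: float      # 0 (calm) .. 1 (fearful), from an audio model
    stress_from_vitals: float   # 0 (rest) .. 1 (high stress), from pulse/temperature/sweat


ALERT_THRESHOLD = 0.8  # assumed value, chosen only for this example


def should_send_alert(scores: RiskScores) -> bool:
    """Fuse both modalities and decide whether to trigger an alert."""
    combined = 0.5 * scores.fear_from_voice + 0.5 * scores.stress_from_vitals
    return combined >= ALERT_THRESHOLD


def on_new_window(scores: RiskScores) -> None:
    """What the smartphone side might do for each new measurement window."""
    if should_send_alert(scores):
        # In the system described, the phone would contact emergency services,
        # the police, or the victim's close circle via the cloud back end.
        print("ALERT: notifying emergency services and trusted contacts")


# Example: a window with high fear in the voice and elevated vital signs.
on_new_window(RiskScores(fear_from_voice=0.9, stress_from_vitals=0.8))
```

In a real deployment, the fusion rule, the threshold and who gets notified would be set by the research team and the protection services rather than hard-coded as here.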

Rituerto says these devices could be in operation within a few years (so far, testing has been limited to the laboratory) and that they will soon begin to be tested on the street.


Victim status

The engineer was also involved in another study on the use of artificial intelligence to detect, through the voice, whether a woman is a victim of gender violence; that is, the technology can determine whether a woman has suffered abuse by analyzing her voice. The system they developed achieved an accuracy of 73%.

Rituerto underlines that the algorithm, trained on victims’ data, was able to discern palpable differences between women who were victims and those who were not, differences related to the rhythm of the voice, its intensity, the number of words used, the speed of speech…

“We saw that it was able to separate victims from non-victims because there is something in their voices that differentiates them,” she points out.
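The article names the kinds of voice features involved but not the model. As a rough, hypothetical illustration of how a classifier over such features could be trained and evaluated, here is a minimal scikit-learn sketch; the feature names, the synthetic data and the random-forest choice are assumptions, and the random labels mean it will score near chance rather than the 73% reported in the study.

```python
# Illustrative sketch only: features, data and model choice are assumptions,
# not the actual system that reached 73% accuracy in the study.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Each row stands in for one speaker, with features of the kind the article
# mentions: speech rate, voice intensity, number of words, a rhythm measure.
n_speakers = 200
X = np.column_stack([
    rng.normal(4.5, 1.0, n_speakers),    # speech rate (syllables per second)
    rng.normal(60.0, 8.0, n_speakers),   # mean intensity (dB)
    rng.poisson(120, n_speakers),        # number of words in the sample
    rng.normal(0.4, 0.1, n_speakers),    # pause ratio as a rhythm proxy
])
y = rng.integers(0, 2, n_speakers)       # 1 = victim, 0 = non-victim (labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real recordings, a feature-extraction step computing rhythm, intensity, word count and speech rate from the audio would replace the random arrays above.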

She currently works at the LMU Klinikum university hospital and the Max Planck Institute of Psychiatry in Germany, where she researches the detection of mental illnesses by applying artificial intelligence to brain magnetic resonance images.

By Editor
