
Artificial intelligence recognizes suicidal thoughts

By Christian Boas

Artificial intelligence (AI) is intended to predict suicides in public transport. To this end, the London-based company Human has developed an AI-based algorithm that reads emotions from faces. To detect extreme emotions, the software focuses on facial movements that occur within milliseconds. The program can be used via the mobile internet, an API, or surveillance cameras.

Detecting highs and lows
Just because someone's facial expressions show clear signs of distress does not mean that a suicide attempt is imminent; there are many other possible reasons for such expressions.

“A person's facial expressions and body language give us reliable indications of how that person feels. They do not reveal, however, why the person is experiencing a particular emotion. For example, if someone raises the eyebrows and pulls them together, this is a relatively reliable sign that the person is feeling anxiety,” says Dirk Eilert of the Eilert Academy of Emotional Intelligence.

Human is already working with a variety of transport and emergency services in Europe, the USA, and Canada. “Historical data show our clients the places where suicide attempts tend to occur more often,” Human founder Yi Xu explained to the portal The Memo.

Prevention
The emotion-tracking technology can be used not only to prevent suicides but also in other areas such as workplaces, consumer centers, or public authorities. In sports or the insurance sector, it can also be advantageous to assess mental health.

The company has even been certified by the British organization GamCare to identify potential problem gamblers. “We are aiming our technology at people who would otherwise be lost in the crowd,” Xu said. The results are quantifiable and point to extreme emotional highs and lows.

Potential for misuse
As with other technologies, the risk of misuse cannot be ruled out in the case of emotion tracking. The results obtained by the AI could, for example, be used without the knowledge of the person being analyzed. For this reason, clear legal regulations are also needed here, so that new technologies can only be used for the benefit of all.
