
AI Now Institute wants to restrict emotion-detecting technology

The AI Now Institute has called for new laws to restrict the use of emotion-detecting technology.

The US-based body said the technology - currently sold to help vet job seekers, test criminal suspects for signs of deception, and set insurance prices - is "built on markedly shaky foundations", and wants the software banned from use in important decisions that affect people's lives.

AI Now's annual report claims affect recognition is undergoing a period of significant growth and could already be worth as much as $20 billion, but warns there is "no substantial evidence" to support the technology.

Affect recognition scans people's faces to detect facial expressions that could give away a person's emotional state, but AI Now insists the tests aren't always accurate.

AI Now co-founder Professor Kate Crawford explained: "It claims to read, if you will, our inner-emotional states by interpreting the micro-expressions on our face, the tone of our voice or even the way that we walk.

"It's being used everywhere, from how do you hire the perfect employee through to assessing patient pain, through to tracking which students seem to be paying attention in class.

"At the same time as these technologies are being rolled out, large numbers of studies are showing that there is ... no substantial evidence that people have this consistent relationship between the emotion that you are feeling and the way that your face looks."

The software is currently used in a variety of different ways, and some companies have already spoken in defence of the technology following AI Now's report.

Oxygen Forensics, which offers emotion-detecting software to the police, insisted the tech will help "make the world a safer place".

Chief operating officer Lee Reiber said: "The ability to detect emotions, such as anger, stress, or anxiety, provide law-enforcement agencies additional insight when pursuing a large-scale investigation. Ultimately, we believe that responsible application of this technology will be a factor in making the world a safer place."

Meanwhile, Emteq - a Brighton-based firm working to integrate emotion-detecting tech into virtual-reality headsets - agreed with AI Now's assessment, saying facial expressions aren't always caused by an emotion.

The company's founder Charles Nduka said: "One needs to understand the context in which the emotional expression is being made. For example, a person could be frowning their brow not because they are angry but because they are concentrating or the sun is shining brightly and they are trying to shield their eyes. Context is key, and this is what you can't get just from looking at computer vision mapping of the face."
