AI voices fool humans, but the brain’s response is different

Summary: People struggle to distinguish between human voices and AI-generated voices, correctly identifying them only about half of the time. Despite this, brain scans revealed different neural responses to human and AI voices: human voices triggered areas related to memory and empathy, while AI voices activated areas involved in error detection and attention regulation.

These findings highlight both the challenges and potential of advanced voice AI technology. Further research will examine how personality traits influence the ability to discern the origin of a voice.

Key facts:

  1. Identification struggles: Participants correctly identified human voices 56% of the time and AI voices 50.5% of the time.
  2. Neural responses: Human voices activated areas associated with memory and empathy; AI voices triggered areas for error detection and attention regulation.
  3. Perceptual bias: Neutral voices were often perceived as AI, while happy voices were perceived as human.

Source: FENS

Humans are not very good at distinguishing between human voices and those generated by artificial intelligence (AI), but our brains respond differently to human and AI voices, according to research presented today (Tuesday) at the Federation of European Neuroscience Societies (FENS) Forum 2024.

The study was presented by PhD student Christine Skjegstad and conducted by Ms Skjegstad and Professor Sascha Frühholz, both from the Department of Psychology at the University of Oslo (UiO), Norway.

Ms Skjegstad said: “We already know that AI-generated voices are so advanced that they are almost indistinguishable from real human voices. It is now possible to clone a person’s voice from just a few seconds of recording, and fraudsters have used this technology to impersonate a loved one in need and trick victims into transferring money.

“While machine learning experts are developing technological solutions for AI voice detection, much less is known about the human brain’s response to these voices.”

The research involved 43 people who were asked to listen to human and AI-generated voices expressing five different emotions: neutral, anger, fear, joy and pleasure. They were asked to identify the voices as synthetic or natural while their brain activity was measured using functional magnetic resonance imaging (fMRI).

fMRI detects changes in blood flow in the brain, revealing which parts of the brain are active. Participants were also asked to rate the characteristics of the voices they heard in terms of naturalness, trustworthiness and authenticity.

Participants correctly identified human voices only 56% of the time and AI voices 50.5% of the time, meaning they were equally bad at identifying both types of voices.

Participants were more likely to correctly identify a neutral AI voice as AI (75% of the time, compared with only 23% of neutral human voices correctly identified as human), suggesting that people assume neutral voices sound more AI-like.

Female neutral AI voices were correctly identified more often than male neutral AI voices. For happy human voices, the correct identification rate was 78%, compared with just 32% for happy AI voices, suggesting that people associate happiness with sounding human.

Both AI and human neutral voices were perceived as the least natural, trustworthy and authentic, while human happy voices were perceived as the most natural, trustworthy and authentic.

However, looking at brain imaging, the researchers found that human voices elicited stronger responses in areas of the brain associated with memory (right hippocampus) and empathy (right inferior frontal gyrus).

AI voices elicited stronger responses in areas related to error detection (right anterior middle cingulate cortex) and attentional regulation (right dorsolateral prefrontal cortex).

Ms Skjegstad said: “My research shows that we are not very accurate in identifying whether a voice is human or AI. Participants also often expressed how difficult it was for them to tell the difference between the voices. This suggests that current AI voice technology can mimic human voices to the point where it is difficult for humans to reliably distinguish them.

“The results also suggest a perception bias: neutral voices were more likely to be identified as AI-generated and happy voices more likely to be identified as human, regardless of whether they actually were. This was especially the case with neutral female AI voices, which may be due to our familiarity with female voice assistants like Siri and Alexa.

“Even though we are not very good at telling human voices from AI voices, there seems to be a difference in the brain’s response. AI voices can create heightened vigilance, while human voices can create a sense of kinship.”

The researchers now plan to investigate whether personality traits, such as extraversion or empathy, make people more or less sensitive to noticing differences between human and AI voices.

Professor Richard Roche is Chair of the FENS Forum Communications Committee and Deputy Head of the Department of Psychology at Maynooth University, Maynooth, County Kildare, Ireland, and was not involved in the research.

He said: “Exploring the brain’s responses to AI voices is crucial as this technology is constantly evolving. This research will help us understand the potential cognitive and social implications of voice AI technology, which can support policy and ethical guidelines.

“The risks of using this technology to defraud and deceive people are obvious. But there are also potential benefits, such as providing voice replacements for people who have lost their natural voice. AI voices could also be used in therapy for some mental health conditions.”

About this AI and neuroscience research news

Author: Kerry Noble
Source: FENS
Contact: Kerry Noble – FENS
Image: The image is credited to Neuroscience News

Original Research: The findings will be presented at FENS Forum 2024
