OpenAI Whisper speech recognition hallucinations
2024-10-26 20:50:00+02 by Dan Lyke
Hmmmm... Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said
A speaker in another recording described “two other girls and one lady.” Whisper invented extra commentary on race, adding “two other girls and one lady, um, which were Black.”
In a third transcription, Whisper invented a non-existent medication called “hyperactivated antibiotics.”
Some of the examples in Careless Whisper: Speech-to-Text Hallucination Harms are pretty amazingly egregious.
https://doi.org/10.48550/arXiv.2402.08021
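If you want to poke at this yourself, here's a minimal sketch (mine, not the paper's method) using the open-source openai-whisper package: transcribe a clip, then diff the output against a reference transcript to flag words Whisper inserted. The file name and reference text below are hypothetical placeholders.

import difflib

import whisper  # pip install openai-whisper

# Load a small Whisper model and transcribe a (hypothetical) audio clip.
model = whisper.load_model("base")
hypothesis = model.transcribe("clip.wav")["text"].lower().split()

# Hypothetical reference transcript for the same clip; in the example above,
# the speaker actually said "two other girls and one lady."
reference = "two other girls and one lady".split()

# Words Whisper emits that don't align with the reference are candidate
# hallucinations (insertions or substitutions).
matcher = difflib.SequenceMatcher(None, reference, hypothesis)
for tag, i1, i2, j1, j2 in matcher.get_opcodes():
    if tag in ("insert", "replace"):
        print("Whisper added/changed:", " ".join(hypothesis[j1:j2]))

The word-level diff is a crude stand-in for the paper's analysis, but it's enough to spot the kind of wholesale invention quoted above.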
[ related topics:
Artificial Intelligence
]