Whisper AI Transcriptions Prone to Hallucinations
Researchers find Whisper's AI often adds inaccuracies, including racial commentary and false medical data, in eight of ten transcriptions.
OpenAI's Whisper Transcription Tool Has Hallucination Issues, Researchers Say
Added on 01/29/2025

Speaker 1: OpenAI's Whisper transcription tool has hallucination issues, researchers say. Researchers find hallucinations in Whisper transcriptions from OpenAI's AI software. Researchers say Whisper has introduced everything from racial commentary to imagined medical treatments into transcripts. A University of Michigan researcher studying public meetings found hallucinations in eight out of every ten audio transcriptions. An OpenAI spokesperson said the company is "continually working to improve the accuracy of our models."
