Whisper AI Transcriptions Prone to Hallucinations (Full Transcript)

Researchers find that OpenAI's Whisper transcription tool often adds inaccuracies, including racial commentary and false medical data; one study found hallucinations in eight of ten transcriptions.

Speaker 1: OpenAI's Whisper transcription tool has hallucination issues, researchers say. They have found hallucinations in transcripts produced by OpenAI's Whisper, with the software introducing everything from racial commentary to imagined medical treatments. A University of Michigan researcher studying public meetings found hallucinations in eight out of every ten audio transcriptions. An OpenAI spokesperson said the company is "continually working to improve the accuracy of our models."
