Research: AI transcription tool generates made-up statements

Hospitals are increasingly adopting OpenAI’s artificial intelligence-powered transcription tool, Whisper, despite warnings from software engineers, developers and academic researchers that it invents text. The invented text, known in the industry as hallucinations, can include racial commentary, violent rhetoric and imagined medical treatments. Whisper already serves thousands of companies, transcribing and translating audio in multiple languages. Although OpenAI’s online disclosures recommend against using Whisper in “decision-making contexts, where flaws in accuracy can lead to pronounced flaws in outcomes,” hospitals use Whisper to transcribe doctors’ visits so that providers can spend less time on note-taking and report writing. (MedPage Today article, 10/28/24)