Hospitals use a transcription tool powered by a hallucination-prone OpenAI model

[Image: An illustration of a woman typing on a keyboard, her face replaced with lines of code. Image: The Verge]

A few months ago, my doctor showed off an AI transcription tool he used to record and summarize his patient meetings. In my case, the summary was fine, but researchers cited by ABC News have found that’s not always the case with OpenAI’s Whisper, which powers a tool many hospitals use — sometimes it just makes things up entirely.

Whisper is used by a company called Nabla for a medical transcription tool that it estimates has transcribed 7 million medical conversations, according to ABC News. More than 30,000 clinicians and 40 health systems use it, the outlet writes. Nabla is reportedly aware that Whisper can hallucinate, and is “addressing the problem.”

A group of researchers from Cornell University, the University of Washington, and…

