testo12@lemmynsfw.com to Technology@lemmy.world • ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans (English)
That wouldn’t guarantee correct answers.
It’s arguably more dangerous if ChatGPT gives mostly sound, specific medical advice, because that leads people to put more trust in it than they should.
Unidentified Flying Objects have always existed. In the US, around 99% of submitted cases were eventually identified or revealed to be bogus. The remaining 1% are still unexplained, but the track record of the identified cases doesn’t make the alien explanation very likely.