<https://www.livescience.com/health/rectal-garlic-insertion-for-immune-support-medical-chatbots-confidently-give-disastrously-misguided-advice-experts-say>
"Popular AI chatbots often fail to recognize false health claims when they're
delivered in confident, medical-sounding language, leading to dubious advice
that could be dangerous to the general public, such as a recommendation that
people insert garlic cloves into their butts, according to a January study in
the journal The Lancet Digital Health. Another study, published in February
in the journal Nature Medicine, found that chatbots were no better than an
ordinary internet search.

The results add to a growing body of evidence suggesting that such chatbots
are not reliable sources of health information, at least for the general
public, experts told Live Science.

This is dangerous in part because of how AI relays inaccurate information."
Via Joyce Donahue.
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net Andrew Pam
http://xanadu.com.au/ Chief Scientist, Xanadu
https://glasswings.com.au/ Partner, Glass Wings
https://sericyb.com.au/ Manager, Serious Cybernetics