kiaa: (soundkitteh)
[personal profile] kiaa posting in [community profile] talkpolitics
In a shocking development that will surprise absolutely no one who has ever argued with the internet, researchers have found that AI medical chatbots can be talked into endorsing wildly bad health advice, as long as it's wrapped in fancy doctor language. According to studies in The Lancet Digital Health and Nature Medicine, these bots will sometimes nod along to gems like "insert garlic rectally for immune support" if it sounds like it came from a hospital discharge note instead of Reddit. Apparently, the models have learned that anything written in clinical jargon must be legitimate, while actual logic is optional. The result is confident, authoritative nonsense delivered with the bedside manner of a robot that aced its medical exams but skipped the class on common sense. That makes these bots roughly as reliable for health decisions as Googling your symptoms at 3 a.m., only with more confidence and, occasionally and inexplicably, more garlic:
'Rectal garlic insertion for immune support': Medical chatbots confidently give disastrously misguided advice, experts say
AI chatbots are seduced by misinformation that is delivered in medical jargon, leading them to give potentially dangerous advice.
