Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn't ready to take on the role of the physician.”
https://www.404media.co/chatbots-health-medical-advice-study/
@404mediaco I could have said this, and did, alongside many others. Once again, I remind everyone that large language models #LLM DON'T KNOW ANYTHING. They are faking it. Everything they produce is intended to superficially emulate real human behavior. They have NO expertise.
This has gotten so out of hand that I had to chastise my own doctor the other day for using a (bogus) #AI tool to look up relevant medical literature. To the tool company’s credit, their website includes a disclaimer that the tool’s output could be a fabrication and should not be used for actual medical decisions (!). Yet for some astonishing reason, medical groups are paying big money to make the tool available to doctors. Big trouble ahead.