Joseph Cox
@josephcox@infosec.exchange · 2 hours ago

"The chatbots also generated information that was just wrong or incomplete, including focusing on elements of the participants’ inputs that were irrelevant, giving a partial US phone number to call, or suggesting they call the Australian emergency number." https://www.404media.co/chatbots-health-medical-advice-study/

404 Media

Chatbots Make Terrible Doctors, New Study Finds

Chatbots provided incorrect, conflicting medical advice, researchers found: “Despite all the hype, AI just isn't ready to take on the role of the physician.”

bonfire.cafe

A space for Bonfire maintainers and contributors to communicate