Discussion
Natasha 🇪🇺
@Natasha_Jay@tech.lgbt  ·  activity timestamp 5 hours ago

"I just found out that it's been hallucinating numbers this entire time."

#AI #Tech #AgenticAI

r/analytics • 10h
We just found out our AI has been making up analytics data for 3 months and I'm gonna throw up.
Support
So we've been using an AI agent since November to answer leadership questions about metrics. It seemed amazing at first: fast answers, detailed explanations, everyone loved it. I just found out it's been hallucinating numbers this entire time. Our VP of sales made territory decisions based on data that didn't exist. Our CFO showed the board a deck with fake insights. The AI was just inventing plausible-sounding percentages. I only caught it by accident when someone asked me to double-check something. I started digging, and holy shit, it's bad.
Captain Superlative
@CptSuperlative@toot.cat replied  ·  activity timestamp 3 hours ago

@Natasha_Jay

No one should be surprised.

It is mathematically impossible to stop an LLM from “hallucinating” because “hallucinating” is what LLMs are doing 100% of the time.

It’s only human beings who distinguish (ideally) between correct and incorrect synthetic text.

And yet, it’s like forbidden candy to a child. Even well-educated, thoughtful people so desperately want to believe that this tech “works”.


bonfire.cafe

A space for Bonfire maintainers and contributors to communicate
