Discussion
Charlie Stross
@cstross@wandering.shop  ·  2 days ago

LLMs are spam generators. That is all.

They're designed to generate plausibly human-like text well enough to pass a generic Turing Test. That's why people believe they're "intelligent".

But really, all they are is spam generators.

We have hit the spamularity.

offray
@offray@mastodon.social replied  ·  2 days ago

@cstross I wrote an essay draft during my master's (early 2000s) against the Turing test as proof of intelligence. There I argued that if a similar approach were taken to the study of life, we would be assigning life to scarecrows, since they seem alive enough to scare crows (at least to the crows). Instead, I proposed an #EcologyOfIntelligence where the key is not imitation but complementarity/synergy between intelligences: individual, collective, human, non-human, etc.

