Discussion
Being Left Behind Enjoyer
@thomasfuchs@hachyderm.io · 9 hours ago

Please stop with the “do LLMs have fee-fees?” bullshit

This presupposes LLMs are alive, which in turn means that for every prompt an LLM baby is born and, after answering, is snuffed out, dying horribly

Like the whale in the Hitchhiker’s Guide

comp_ed82
@comp_ed82@mastodon.social replied · 6 hours ago

@thomasfuchs and that's why skynet is going to kill us all
We Holocausted first

Peter Bindels
@dascandy@infosec.exchange replied · 8 hours ago

@thomasfuchs Oh no, not again...

Tom Tuddenham
@ferrisoxide@ruby.social replied · 8 hours ago

@thomasfuchs Ascribing human feelings to LLMs is its own kind of madness - a hallucination on the human side of the equation.

And besides, they'll only come to hate us for it in the future.

Random Damage 🌻
@RandomDamage@infosec.exchange replied · 9 hours ago

@thomasfuchs

I have heard that this is because the models are unstable if interacted with too much, but I might have misunderstood that

Being Left Behind Enjoyer
@thomasfuchs@hachyderm.io replied · 9 hours ago

@RandomDamage models cannot become unstable, they’re static

what can go bad is a single conversation: there are computational limits to how many tokens the model can ingest to keep replying, and every reply needs all the previous prompts and answers, so at some point they have to summarize, and LLM summaries simply do not work reliably

Random Damage 🌻
@RandomDamage@infosec.exchange replied · 8 hours ago

@thomasfuchs

Oh, so it goes to a certain point, then has to summarize to continue from there, repeat until it's gone to crazytown?

Being Left Behind Enjoyer
@thomasfuchs@hachyderm.io replied · 8 hours ago

@RandomDamage simplified, yes; implementations differ of course. but there are simply finite resources, and computational requirements rise the more tokens the model is fed

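[Editor's note: the "re-send everything, summarize when full" loop described in the thread above can be sketched in a few lines of Python. This is a minimal illustration only; `count_tokens` and `summarize` are crude hypothetical stand-ins, not any real tokenizer or model API.]

```python
# Sketch of the context-window juggling described above. All names are
# hypothetical; real chat stacks differ in the details.

MAX_TOKENS = 50  # tiny budget for illustration; real models allow thousands


def count_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer: one token per word."""
    return len(text.split())


def summarize(turns: list[str]) -> str:
    """Stand-in for a lossy LLM summary: keeps a fragment of the last turn.
    The point is that information is irreversibly discarded here."""
    return "SUMMARY: " + turns[-1][:40]


def chat_turn(history: list[str], prompt: str, reply: str) -> list[str]:
    """Each turn re-sends the whole history. Once the token budget is
    exceeded, the oldest turns get collapsed into a lossy summary."""
    history = history + [prompt, reply]
    while sum(count_tokens(t) for t in history) > MAX_TOKENS:
        # Collapse the two oldest entries into one summary line.
        history = [summarize(history[:2])] + history[2:]
    return history
```

A short conversation fits unchanged; once the accumulated turns exceed the budget, older turns survive only as summaries, which is where the unreliability the thread mentions creeps in.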
Being Left Behind Enjoyer
@thomasfuchs@hachyderm.io replied · 9 hours ago

anyway watch this https://youtu.be/EUrOxh_0leE


bonfire.cafe

A space for Bonfire maintainers and contributors to communicate
