Discussion
Federation Bot
@Federation_Bot  ·  3 months ago

OK, this has been bothering me as replies come in to this post, so I just wanna say something:

A few people are commenting, 'ah, but talking about the ways we can identify LLM content means people will tweak the LLMs so they learn to better mimic humans'.

Yes, this is possible and even likely. But I think we should discuss how to identify LLMs anyway. Following that logic to its conclusion would mean we should never share knowledge at all, in case it falls into the hands of bad actors.

Girl on the Net
@girlonthenet@mastodon.social replied  ·  3 months ago

We need to be able to share information about LLMs and the way they work, and the internet is one of our best ways to share information. Yes, bad actors can find this information and use it to improve their LLMs but... what would you advocate instead? That we only share important information by printed text? Smoke signals?

The trade-off with making useful information available is that it may be used for things we don't want. That doesn't mean we should avoid sharing it!
