Discussion
Tom Morris
@tommorris@mastodon.social · 2 weeks ago
Kevin Beaumont
@GossiTheDog@cyberplace.social · 2 weeks ago

Twitter’s Grok AI is now being used by people to “undress” women in photos, as verified by the BBC. Twitter declined to comment beyond saying “legacy media lies”. It isn’t a lie.

In the UK it is illegal to create or share non-consensual intimate images, including via AI. The UK gov say they are investigating.

https://www.bbc.co.uk/news/articles/c98p1r4e6m8o

RE: https://cyberplace.social/@GossiTheDog/115826683070912702

The UK government’s official internet policy:

1. if you run a web forum or fedi instance, you need to do a bunch of paperwork to comply with the #OnlineSafetyAct to protect the kids.

2. if you are rich and run a neo-nazi website with officially posted virtual CSAM, politicians won’t leave your platform, despite that being a million times worse than anything you’d find on a forum about fixed-gear cycling or a Mastodon server run by some queer furries running Arch for their polycule.

Incredibly coherent.

Collette
@collette@mastodon.ie replied · 2 weeks ago

@tommorris Here's the wayback version (unblocked) https://web.archive.org/web/20190629101643/https://www.vox.com/2019/6/27/18761639/ai-deepfake-deepnude-app-nude-women-porn

