The Heart of the TARDIS
@TheHeartoftheTARDIS@pixelfed.social · 5 days ago
#meme #memes #elonmusk #musk #twitter #spacex #feminism
Elon Musk’s Grok AI tool generated three million sexualised images in 11 days, according to estimates made by the Centre for Countering Digital Hate. 23,000 of these images were of children. Users of the feature had the ability to upload photos of people (private individuals or public figures) and request Grok create deepfake pornography of them. The Guardian's reporting revealed the tool was also being used to depict women being killed. A week after the peak of its usage, instead of shutting it down, it was made a paid feature. Imagine discovering your product was used to create child exploitation material and, instead of cutting off these functions, making it part of your commercial strategy for premium users.