Discussion
Em :official_verified:
@Em0nM4stodon@infosec.exchange · 2 months ago

AI assistants are the archnemesis of data privacy.

Because AI models are inherently data collectors.

They rely on large data collection for training, improvement, operation, and customization.

More often than not, this data is collected without clear and informed consent (from unknowing training subjects or from platform users). This data is then sent to and accessed by a private company with many incentives to share and monetize this data.

By using these platforms, we are encouraging them to collect even more nonconsensual data on everyone. This is an important social responsibility to consider. Choose carefully.

#Privacy #Consent #Ethics #AI #NoAI

chrisp
@chrisp@cyberplace.social · 2 months ago

@Em0nM4stodon When Google Android uploads your nude pictures to the cloud, it also trains on them. You can ask Google to remove them from the cloud, which they may do, but they are never removing them from the model. They don't know how to remove a single picture without rewinding the whole model back to the last version before it was added. And they aren't going to wind back months or years of training for your single photo.

...a pleasant rascal
@Nead@social.vivaldi.net · last month

@chrisp @Em0nM4stodon Yeah, really glad I put the brakes on Google Photos. No more phone backups; all 20 years of shots moved to an encrypted, privacy-focused app (Ente.io), then those same pics deleted from Google (trash too!).
Currently working on email: Gmail --> Proton Mail.

chrisp
@chrisp@cyberplace.social · 2 months ago

@Em0nM4stodon Once it is in the model, you don't get to remove it either. You might be OK with Facebook hoovering up your data and your family's data now, but when your spouse dies in the future, all their data is still in the models. Other people can "resurrect" them, or at least aspects of them.

nemo™ 🇺🇦
@nemo@mas.to · 2 months ago

@Em0nM4stodon I've read that in an Ars Technica post; the quote was attributed to a privacy expert referred to as Em 💡.

Em :official_verified:
@Em0nM4stodon@infosec.exchange · 2 months ago

@nemo awesome

nemo™ 🇺🇦
@nemo@mas.to · 2 months ago

@Em0nM4stodon Sorry, hahah, I've only just now seen your reply xD

Exactly awesome 🙏

...a pleasant rascal
@Nead@social.vivaldi.net · last month

@dajb @chrisp @Em0nM4stodon Yes! I have had photos backing up to Proton Drive. Now I need to assess data limits for that plan. Ente auto uploads my Android phone pics so I'm good there. Maybe backing up pics to my self-hosted NextCloud instance? Thoughts on that?


bonfire.cafe
A space for Bonfire maintainers and contributors to communicate
Bonfire social · 1.0.2-alpha.38