Continue, or "try it now" a popup from Gmail now asks, offering to compose your next message with Gemini. I guess the tiny "x" is the "fuck no" button?
AI assistants are the archnemesis of data privacy.
Because AI models are inherently data collectors.
They rely on large-scale data collection for training, improvement, operation, and customization.
More often than not, this data is collected without clear and informed consent (from unknowing training subjects or from platform users). It is then sent to and accessed by a private company with many incentives to share and monetize it.
By using these platforms, we are encouraging them to collect even more nonconsensual data on everyone. This is an important social responsibility to consider. Choose carefully.
"Your prompts remain private while using our AI" isn't the pro-privacy statement most AI companies seem to believe it to be.
What about the data of the subjects you used to train your AI model? Did you ask all of them for consent? What's that? Your product isn't possible if you ask for consent first?
Well, it's definitely not private then.
"Your prompts remain private while using our AI" isn't the pro-privacy statement most AI companies seem to believe it to be.
What about the data of the subjects you used to train your AI model? Did you ask consent from all of them? What's that? Your product isn't possible if you ask for consent first?
Well, it's definitely not private then.
There will never be an AI tool that is truly private if it was trained on nonconsensual data.
Even if a platform were able to create perfect protections for its users' prompts and results, if that platform is built on or uses an AI model that was trained on, or is updated and optimized with, data scraped from millions of people without their consent, then of course it isn't "privacy-respectful."
How could it be?
The company is saying:
"We respect the privacy of our users while they are using our platform, but outside of it, it's fair game."
Users who think they are using a privacy-respectful platform are in fact saying:
"Privacy for me and not for thee,"
and are directly contributing to the platform needing to scrape even more nonconsensual data to improve.
Always ask: where does the training data come from?
Without the assurance that a platform only uses AI models trained exclusively on ethically acquired data, it is not a privacy-respectful platform.
Filming women without their consent and knowledge is absolutely disgusting and creepy as hell. Posting it online makes it even more despicable.
Our governments are refusing to protect us from this abuse.
Companies selling creep glasses equipped with cameras should be publicly shunned into bankruptcy.
"The man who filmed her had posted over a hundred similar videos on his TikTok page, and he is not the only one making this kind of content."
Consent is the most important concept you can learn about to better understand data privacy rights.
And I'm not specifying free and informed consent here, because if it's not free and informed, it's not consent.
📣 NEW w/#UdbhavTiwari Mapping the technical reality & privacy/security perils of pushing AI agents into our infra
We offer palliatives, but the core issues are paradigmatic: 'agency' relies on pervasive data access + ability to act w/o explicit consent.
RE: https://mastodon.world/@Mer__edith/115854211176763097
Excellent #39c3 talk on so-called "agentic AI" and how it's infiltrating operating systems. Key quote from the end: "Without implementation of the proposed [palliatives] we risk locking ourselves into a digital infrastructure where we are no longer the users of our devices but the managed resources of an automated economy" #agenticAI #security #agency #consent