
"sideloading" is a stupid made up term invented to delegitimize installing software.
Heres a bunch of other things I'm doing while "sidestepping" some supposed central authority:
- sideshopping (buying stuff from a store that isn't amazon)
- sidedining (eating or making food that isn't from mcdonalds)
- sidethinking (using my own brain instead of asking chatgpt)
- sidelistening (to my own music instead of on spotify)
- sidechatting (irl instead of online)

#android #sideloading #google #bullshit

"sideloading" is a stupid made up term invented to delegitimize installing software.
Heres a bunch of other things I'm doing while "sidestepping" some supposed central authority:
- sideshopping (buying stuff from a store that isn't amazon)
- sidedining (eating or making food that isn't from mcdonalds)
- sidethinking (using my own brain instead of asking chatgpt)
- sidelistening (to my own music instead of on spotify)
- sidechatting (irl instead of online)

#android #sideloading #google #bullshit

This is, to put it mildly, utter bullshit.

You can store a decade of email for a million people (call it 10-20 GB each) in a 10-20 PB storage array that costs under £1M per year, and that cost keeps falling. Ten-year-old email is *cold* storage, with drives not even spinning most of the time; you only really need 5-10% of it available on demand.
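
The arithmetic is easy to sanity-check. A quick sketch, using midpoint figures taken from the ranges in the post (the 15 GB per user and ~7.5% hot fraction are my assumed midpoints, not measured data):

```python
# Back-of-envelope check of the post's storage numbers.
# All inputs are the post's own estimates, taken at their midpoints.
users = 1_000_000
gb_per_user = 15            # post's range: 10-20 GB per decade of email
hot_fraction = 0.075        # post's range: only 5-10% needed on demand

total_pb = users * gb_per_user / 1_000_000   # 1 PB = 1,000,000 GB
hot_pb = total_pb * hot_fraction

print(f"total archive: {total_pb:.0f} PB")
print(f"hot tier:      {hot_pb:.2f} PB")
```

A million users at the midpoint lands squarely inside the 10-20 PB array the post describes, with only about 1 PB needing to stay spun up.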

The Environment Agency are gaslighting us. One wonders who put them up to it.
social.lol/@robb/1150165791501

@cstross
Think of all the processing power the police are using inputting all that data from people arrested for carrying #PalestineAction signs.

#bullshit #FuckStarmer

When more than 90% of your answer is pure nonsense… "winkelmandje vol" is NOT the title of a song or theatre programme by Yentl en de Boer. That's not a "mistake" by the AI, or a "hallucination", it's just #bullshit and polluting #desinformatie.

Do I now have to fire off a second search query to find out whether the first search result is even correct?

Is there a plug-in that doesn't merely filter out all the AI answers and the first four pages of Google Ads, but dumps them straight into a rubbish bin?

Marketing departments are like neighborhood kids leaving bags of shit on your doorstep.

Marketing departments using generative AI are like neighborhood kids leaving burning bags of shit on your doorstep.

They're both shit, but one is being advertised as the new hotness while being objectively worse.

#AI #Marketing #Bullshit

A friend sent me the story of the LLM deleting a database during a code freeze and said "it lied when asked about it." I assert that a generative AI cannot lie. These aren't my original thoughts. But if you read Harry Frankfurt's famous essay *On Bullshit* (downloadable PDF here), he lays out a carefully reasoned definition of bullshit. And this paragraph near the end of the essay explains why an LLM cannot lie.

It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction. A person who lies is thereby responding to the truth, and he is to that extent respectful of it. When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he consider his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.

And that's a generative artificial intelligence algorithm. Whether generating video, image, text, network traffic, whatever. It has no reference to the truth and is unaware of what truth is. It just says things. Sometimes they turn out to be true. Sometimes not. But that's irrelevant to an LLM. It doesn't know.

#bullshit #ai #llm #genai
