Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
https://buttondown.com/maiht3k/archive/why-you-should-refuse-to-let-your-doctor-record/
Yes, and of course I said no.
But then I discovered they had used AI transcription when adding notes to my record after the appointments, because the notes were full of obvious errors. So I needed to lecture them again about my right to have it kept off my medical record.
What makes this even worse is that they all know how badly it works; the media frequently reports complaints from the medical community about the horrific errors, as well as the inefficiency, this overhyped piece of crap creates.
@emilymbender no but yesterday, I did see a poster mentioning it at a local clinic
@emilymbender thankfully my therapist was like "yeah dude don't worry about it it's weird" but i still get an email alongside every 'upcoming appointment' email reminding me to sign the permission form
@emilymbender
One of my first jobs was providing tech support to doctors in a hospital setting. They were some of the most tech-illiterate folks I've ever encountered. They have no concept of operational security.
No doctor has ever asked me for permission to store any information about me in whatever systems they're using. For all I know they store it in plain text on an insecure S3 bucket.
@emilymbender no but in the agreement they ask us to sign periodically it said that they might use AI. So I said I wasn’t signing if they were going to. They asked the doc and she said no I don’t use AI transcription at all and I didn’t know that was in there!
@emilymbender
I have-- and refused!
@emilymbender Funnily enough, transcription can preserve privacy without issues. Whisper.cpp runs decently well on phones, and can be run on servers that process patient records under the same security constraints. It could easily be run locally.
Problem is, that's extremely hard to prove in the current "just slap a gear on it and call it steampunk" climate. I would definitely not trust a random provider.
And if they do "summarization", forget privacy.
@emilymbender psychiatry did it without informed consent. I am livid
@emilymbender For a doctor's perspective on the more profound side effects of "efficiency":
https://benngooch.substack.com/p/i-was-an-enthusiastic-early-adopter
“I felt myself becoming a passive observer in encounters where I had previously been an active architect. I felt my clinical memory, my narrative identity, and my sense of connection to my patients beginning to erode at the edges.”
@emilymbender Really helpful. Thanks for sharing.
I often reason about it this way. There are very few things like this where, if you opt out this time, you can’t opt in next time. On the other hand, there are LOTS of situations, this being a good example, where opting out after you opted in is substantially more effort (or even impossible).
Opting out by default is usually a safe thing to do. You can always opt in later if you change your mind.
@emilymbender
Evidence shows litigation decreases if doctors have scribes. A doctor isn't allowed to rely on memory in their defence. As the saying goes, "If it is not documented, it didn't happen," even if it did happen and recall can be verified.
The direct effects:
1. More litigation means higher insurance costs for the doctor, and thus higher consultation fees.
2. Doctors who suffer psychological and emotional injury from spurious claims reduce or stop practicing.
So there is high motivation for having a scribe.
As some others pointed out - that evidence is based on recorded encounters being transcribed by humans, or possibly by non-AI speech-to-text.
Nobody has yet studied the resulting practice effects of AI/LLM-based interpretation. Recording, transcription, and interpretation are separate realms and skill sets and “AI” is either unnecessary or unproven in all 3.
@emilymbender I opted out at my physical therapist last week. They told me all their patients who work in tech have opted out.
yes i declined. when she asked why, i said because
1. the companies used sites with CSAM and other abuses
2. it’s spyware. each prompt acts like a honeypot. since you are giving them the info, it bypasses HIPAA; in turn they get to use and sell that info however they please
3. as an antifascist activist, it puts my life in danger by giving companies run by fascists access to my whole medical history.
my MD was shocked. they had no idea about the spyware angle or CSAM
@emilymbender Yes, I was asked to sign a consent (stuck in with the other standard consents) authorizing the doctor’s practice to use an AI scribe. I left the room, went up to the front desk and told them I would not sign the consent under any circumstances. They looked a little surprised, but agreed to have one of the techs act as a scribe as normal. Glad I stood my ground - there is no way in Hell I would let a Doc use AI for anything medical related
@emilymbender I'm fortunate my GP doesn't even trust the national health database.