Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
https://buttondown.com/maiht3k/archive/why-you-should-refuse-to-let-your-doctor-record/
@emilymbender oh, fuck, that happened years ago here in Australia, and I didn't ever feel comfortable with it, ever
@emilymbender I would have thoughts, too. My privacy. There is never any guarantee, but with A.I. involved, privacy ends up in shreds, guaranteed.
@emilymbender I always want to say no, but there’s an unspoken social pressure to just agree, especially when you’re lucky to have 30 seconds of time with your doctor.
@emilymbender No, I was told at the end of the consultation that everything was recorded and that 'it' was making a report now.
@emilymbender
After reading your post I had to rewatch the Ortho Artificial Intelligence short video by DGlaucomflecken 😅🤣 https://youtu.be/WgnWgIOer6s?is=62_p13To_penRg-S
@emilymbender
Got round to reading this article and it's provided food for thought.
Any medical practitioner will be enticed by a solution that appears to reduce workload, particularly if it reduces the interaction with crappy electronic record systems like Powerchart (owned by Larry Ellison's Oracle).
For me, dictating a clinic letter at the end of a specialist consultation is an opportunity to take stock of all the important features, sift through the information and highlight the points of importance for other practitioners involved (including future me).
I'm not sure AI will do that properly, even if it understands my accent.
@emilymbender yuuuuup... dental office. I politely declined and they were cool with it. But will it be that way next time?
@emilymbender ah yeah I remember the diagnosed-gender-dysphoria situation, because some machine learning program heard "male man" when a woman described herself as a mailman
@emilymbender Luckily not been asked that
@emilymbender Excellent post. I worked for many years in healthcare. I know firsthand the incredible pressures on providers to find the time they need to give high-quality care while completing all of their administrative tasks. So I get why these AI tools are attractive. I have consented to have providers use them in my care. But I won’t any longer. The problems you describe are serious & potentially dangerous. I appreciate the perspective that documenting is part of care.
@emilymbender Yes, and I certainly decline. Fortunately, I have a good relationship with my GP, so it hasn't been an issue so far.
@emilymbender @DevlinLeathercraft The orthopedic surgeon who will be taking care of my trigger thumb asked to record our last session. I can't remember whether I asked him if an AI was going to transcribe it, but I will next time.
@emilymbender These kinds of scribes have been commonplace in medical scenarios for a long time at this point
@emilymbender Agree. GP availability is low now; saying no to this means being viewed as difficult, maybe even being ejected from the patient roster. So you can't really say no.
Also, in two visits where reports were prepped by specialists, there were errors from the AI transcription mishearing things that I think a human would not have made (my age cited quite differently in different paragraphs, an operation recorded as having happened when I had said I DID NOT have it, etc.). Correcting them required my time, my effort, and the Dr's disfavor 🫤
Yes, and of course said no.
But then I discovered they had used AI transcription when adding notes to my journal after the meetings, as it was full of obvious errors. So I needed to lecture them again about my right not to have it used on my medical record.
What makes this even worse is that they all know how badly it works: complaints from the medical community about horrific errors, as well as the inefficiency this overhyped piece of crap creates, are frequently reported in the media.
@emilymbender no but yesterday, I did see a poster mentioning it at a local clinic
@emilymbender thankfully my therapist was like "yeah dude don't worry about it it's weird" but i still get an email alongside every 'upcoming appointment' email reminding me to sign the permission form
@emilymbender
One of my first jobs was providing tech support to doctors in a hospital setting. They were some of the most tech-illiterate folks I've ever encountered. They have no concept of operational security.
No doctor has ever asked me for permission to store any information about me in whatever systems they're using. For all I know they store it in plain text on an insecure S3 bucket.
@emilymbender no but in the agreement they ask us to sign periodically it said that they might use AI. So I said I wasn’t signing if they were going to. They asked the doc and she said no I don’t use AI transcription at all and I didn’t know that was in there!
@emilymbender
I have-- and refused!
@emilymbender Funnily enough, transcribing can preserve privacy without issues. Whisper.cpp runs decently well on phones, and can be run on servers that process patient records under the same security constraints. Could easily be run locally even.
Problem is, that’s extremely hard to prove in the current „just slap a gear on it and call it steampunk” climate. I would definitely not trust a random provider.
And if they do „summarization”, forget privacy.
@emilymbender psychiatry did it without informed consent. I am livid
@emilymbender for a doctor's perspective on the more profound side effects of “efficiency”
https://benngooch.substack.com/p/i-was-an-enthusiastic-early-adopter
“I felt myself becoming a passive observer in encounters where I had previously been an active architect. I felt my clinical memory, my narrative identity, and my sense of connection to my patients beginning to erode at the edges.”
@emilymbender Really helpful. Thanks for sharing.
I often reason about it this way. There are very few things like this where, if you opt out this time, you can’t opt in next time. On the other hand, there are LOTS of situations, this being a good example, where opting out after you opted in is substantially more effort (or even impossible).
Opting out by default is usually a safe thing to do. You can always opt in later if you change your mind.
@emilymbender
Evidence shows litigation decreases if Drs have scribes. A Dr isn’t allowed to rely on memory in their defence. It’s said “If it is not documented, it didn’t happen,” even if it did happen & recall can be verified.
The direct effect:
1: more litigation = more insurance cost for the Dr & thus higher consult fees.
2: Drs who have psychological & emotional injury from spurious claims reduce/stop practice.
So there is high motivation for having a scribe.
As some others pointed out - that evidence is based on recorded encounters being transcribed by humans, or possibly by non-AI speech-to-text.
Nobody has yet studied the resulting practice effects of AI/LLM-based interpretation. Recording, transcription, and interpretation are separate realms and skill sets and “AI” is either unnecessary or unproven in all 3.
@emilymbender I opted out at my physical therapist last week. They told me all their patients who work in tech have opted out.
yes, i declined. when she asked, i said because
1. the companies used sites with CSAM and other abuses
2. it’s spyware. each prompt acts like a honey-pot. since you are giving them the info, it by-passes HIPAA. in turn they get to use and sell that info however they please
3. as an antifascist activist, it puts my life in danger by giving companies run by fascists access to my whole medical history.
my MD was shocked. they had no idea about the spyware angle or CSAM
@emilymbender Yes, I was asked to sign a consent (stuck in with the other standard consents) authorizing the doctor’s practice to use an AI scribe. I left the room, went up to the front desk and told them I would not sign the consent under any circumstances. They looked a little surprised, but agreed to have one of the techs act as a scribe as normal. Glad I stood my ground - there is no way in Hell I would let a Doc use AI for anything medical related
@emilymbender I'm fortunate my GP doesn't even trust the national health database.