Have you been asked by a medical provider recently for consent to have an "AI" scribe record your visit? Us, too. And we have **thoughts**
https://buttondown.com/maiht3k/archive/why-you-should-refuse-to-let-your-doctor-record/
@emilymbender Usually I ask my doctor to turn that off.
@emilymbender
Completely agree! I have started having conversations with my patients about the fact that I will never use this technology and to warn them that some institutions/practices have decided they don’t even need patient consent. Aside from all of the privacy issues, it is literally a major part of my job to create detailed, accurate and nuanced notes.
@emilymbender I tried to subscribe to that newsletter and it said "this e-mail address can not be subscribed" Why does it not like my email address?
@emilymbender I do share a lot of AI skepticism, but from a physician perspective (I use it in about 25-30% of visits), there are many highly speculative aspects of this take:
🧵 1/2
1) Point #1 is valid, however, the same data safety questions can be asked regarding other integrated systems. Like where is your EMR data stored, how does your radiology data integrate (reviewed in 3rd party software), etc.
2) Consent: valid concern, but the fullest version would be a long EULA-like text with a checkbox...
@emilymbender Thanks for the write-up! How do you feel about human scribes? I've been saying "yes to human scribes, no to AI scribes" for a while now, but your list makes me realize a lot of the concerns apply there, too.
@emilymbender My provider always asks and always says that the data "doesn't leave our server." Are there really transcription/summarizing AIs that are completely local?
@emilymbender I finally said "no" the last time I went in... and I found out the nurse also wasn't really fond of it either.
@emilymbender I've spotted medically significant errors in the transcription during the session. It's galling.
@emilymbender I have been asked multiple times by medical providers this year if I would consent to AI transcription, and I have said no every time. I'm tempted to ask next time if they have information from an independent audit of the system performance and privacy policy
@emilymbender YES, multiple times, and with wording and timing of the "request" that amounted to coercion. Very little information given in response to questions about what tool exactly was being used, where my data would be sent, their privacy policies etc. Just "this makes it easier for me to help you."
Also recently at the emergency vet hospital, and again in a context that made it extremely difficult to refuse, lest you be seen as a difficult or problem client that creates extra work for the poor vets.
So glad you are writing about this.
@emilymbender I've been asked multiple times, and I always chicken out and say yes when they ask. I think it's the power imbalance, and I don't think it's fair.
Refuse! Don't do the doctor's job of taking notes, let alone support their choice to have AI transcribe notes for them. The accuracy rate is poor, but incorrect notes would still be used to treat you.
Yes, and by a couple of specialists who it has taken a lot of time/money to get in to see, so I've felt almost coerced into agreeing as I didn't want to risk them declining to see me so I consented against my better judgement.
Do you think there are other ways we can fight this outside of directly declining consent in these situations? One of the providers I see who has been using one for ~2 years now is the only person I've been able to see locally who can prescribe a tightly controlled medication, so I'm quite reluctant to risk the (tenuous) relationship I have established (I already don't feel like I can be completely open and honest with her, and fear what may happen if I decline the use of the scribe).
@emilymbender Great article. The privacy of my protected health information is my immediate concern. There are many ways it can and does leak. It’s not the provider’s fault; it is in the whole chain of info systems backing them up. Using an LLM multiplies the risk. Training data is EXTREMELY valuable. There is no way the recording or transcript are getting deleted. It’s getting sold. The incentives are all wrong for ensuring HIPAA compliance.
@emilymbender Yes. I think this is becoming quite common. Medical groups are providing this service through enterprise-wide medical information systems. I was grateful to be asked. I’ll just add that doctors do not like lectures from patients.
@meltedcheese Nowhere am I suggesting that patients lecture their doctors, though?
This information can help patients make decisions for themselves about this. If consent is being requested (as it should be), then 'no' is a complete sentence.
@emilymbender Agreed. I’m sorry that I miscommunicated. => I am the one who “lectured” and only because AI is my area of deep expertise. If I can convince a doctor or two to at least ask the right questions and consult with other doctors before simply accepting the use of LLM technology, that’s a good thing. Patients should have the info, as you say, to make their own decisions.
@emilymbender THANK YOU for point 9. It drives me batty how poorly people (read: management) rate the importance of mental review, slower moments, and backburner thinking for improving or maintaining skills. Something something working out is 10% active work and 90% fueling/rest.
Not to mention that a rushed/stressed provider makes ME stressed and less likely to be candid or “be complicated”, which is not the point of a visit!
@emilymbender yesterday, and I said no.
thanks for putting out a newsletter talking about it
@emilymbender yes. I said no. That I was very much against it.
My doc did say she was going to use it, but I said no. She said it's easier for her. I said no. She no longer brings the thing in the room. My cardiologist, however, just put the device on the desk & I said no. He started to hem & haw, reached to turn it on, then said it wasn't working. I talked about lots of things, and how it felt like I had a basketball in my gut. The summary "he" wrote said that I play basketball for exercise. I'm 71 with health issues. He lied.
@emilymbender The booking system wouldn't let me proceed without providing consent. I'm allowed to opt-out by providing a written request; I've not (yet) done that.
@BoydStephenSmithJr That ... isn't really consent.
@emilymbender My biggest concern is the potential for psychiatric violence. Inaccurate medical notes produced by these systems could very easily be used as evidence of psychosis or some other kind of psychopathology, leading to forced medical treatment. Having already experienced some of that system, it really worries me. I don’t let medical providers use these systems with me.
@emilymbender there are signs at the doctor's office saying you can refuse, but when I did I got a lecture on how this helps, and they acted like I had no clue what I was talking about. I mentioned I worked in tech and it was dismissed. Since I am in an area with few doctors accepting new patients at the moment... how do I really refuse?
My therapist asked for permission, I declined, and after my session we got into a long conversation about why. At least they were curious about it.
@commonst @emilymbender Medical providers are quick to point fingers at patients for being tech naïve. Medical providers, and the medical industry in general, are notoriously the worst at being informed about tech; worse than any industry short of lawyers. That’s actually why HIPAA exists.
@emilymbender other than "AI is just fucking wrong a lot?"
@the_turtle You could read the post that I was linking to, or you could add to the deluge of mansplaining characteristic of the Fediverse.
You chose option #2.
@emilymbender still, AI is just fucking wrong a lot.
@the_turtle And this is still mansplaining.
@emilymbender@dair-community.social see that block button? FUCKING USE IT THEN, SELF-IMPORTANT SHINY LIGHTS ON A SCREEN!
Doctor at Kaiser did ask. I asked what happened if I refused, and they actually weren't able to tell me. So I assume they are recording everything regardless of what I say.
Not a doctor. But my parents' accountant/financial advisor has had AI transcribe the last couple of meetings.
@emilymbender I had an appointment last week (Kaiser Permanente) and my doctor asked if I was fine with being recorded and she explained that it made it easier for her to write up a report later.
No mention of AI but that's not to say that's the real reason.
@netopwibby Oof -- so she asked if you were okay being recorded but did not provide info on what was going to happen to the recording?
@emilymbender Aside from “taking notes,” nope. Didn't seem weird at the time so I didn't probe.
Yep. And I said no. She initially said not to worry because it's all deleted afterwards. I said that, no, it is not; that's not how LLMs work. All that data remains in there somewhere and can be hacked, plus I don't want anything about me used to train those things on principle. She didn't argue.
@emilymbender I've noticed a lot of this use in veterinary medicine recently as well, just FYI.
@emilymbender And, now that I'm thinking about it, I have rarely seen the question put to the client of whether they consent or not. (This is from some work in a clinic and as a client/observer at a couple of clinics.)