Refuse to let your doctor record you
speckx
149 points
196 comments
April 24, 2026
Related Discussions
Found 5 related stories in 56.5ms across 5,498 title embeddings via pgvector HNSW
- Stop microphones from recording your voice structuredPizza · 20 pts · March 29, 2026 · 51% similar
- Parents are refusing routine preventive care for newborns geox · 20 pts · March 21, 2026 · 40% similar
- Show HN: Adentris (YC P25) – Find mistakes in your medical records digitaltzar · 11 pts · April 02, 2026 · 40% similar
- NHS staff refusing to use FDP over Palantir ethical concerns chrisjj · 323 pts · April 03, 2026 · 40% similar
- Stop the Interviews mooreds · 19 pts · March 05, 2026 · 38% similar
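The "% similar" scores above come from comparing title embeddings. A minimal sketch of cosine similarity, the metric typically behind pgvector's `vector_cosine_ops` operator class (the HNSW index only accelerates the nearest-neighbor search over this kind of score; the toy 4-dimensional vectors below are illustrative, real title embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|); 1.0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical "title embedding" vectors for a query title and a candidate story
query = [0.1, 0.3, 0.5, 0.2]
candidate = [0.2, 0.1, 0.4, 0.3]

score = cosine_similarity(query, candidate)
print(f"{score:.0%} similar")
```

In pgvector itself, the equivalent ranking would be a SQL query ordering by the cosine distance operator `<=>`, with the HNSW index making that ordering approximate but fast across thousands of rows.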
Discussion Highlights (20 comments)
josefritzishere
This is a hard no.
pclowes
I understand the concerns, and I'm not sure I would allow myself to be recorded until I knew more. However, I do think we're in a situation where everybody knows that healthcare costs need to come down and that doctors and medical professionals are spread too thin, forced to see ever more patients in the same number of hours, and yet every attempt to improve efficiency gets a "no, not that way" response.
scrawl
> The false promise of efficiency [...] that is extremely unlikely to mean more time with each patient. Instead, it will mean more patients.

Nit: that is a real efficiency gain. Seeing more patients sounds better on the face of it.
walrus01
It's interesting how lots of service providers of all sorts will insist that you agree to their Terms of Service, Acceptable Use Policy, End User License Agreement (or whatever they want to call it) before engaging with you, but when the consumer insists on enforcing their own personal policy in the opposite direction such as refusing consent to recording or feeding your PII into some opaque AI system, suddenly it's a problem.
k2xl
I think the post conflates two issues:

1. AI-generated charting.
2. The existence of a reliable record of the visit.

I am skeptical of the first in some cases (i.e. bias), but strongly in favor of the second.

My father is 80 and has Parkinson's. He routinely leaves appointments unsure of what the doctor said, what changed, or what he is supposed to do next. Even when I attend with him, we sometimes disagree afterward about what exactly was recommended. This happens with pediatric appointments too. My wife and I occasionally remember instructions differently: medication timing, symptoms to watch for, when to call back, whether something was "normal" or needed follow-up. That is a care quality problem, not just a convenience problem.

The risks are real: privacy, consent, retention, training use, liability, and automation bias. But those argue for strict controls, not for a blanket refusal. Make it opt-in, give the patient access, prohibit training without explicit consent, keep retention short, and require clear auditability.

I do not want opaque AI quietly rewriting the medical record. But I also do not think "everyone relies on memory after a stressful 12-minute appointment" is some gold standard we should preserve.
nubinetwork
Dead link (or it was?)
adit_ya1
Almost every point follows the same structure:

> "Here is a real concern about implementation" → "Therefore you should refuse entirely"

This skips the middle step of "therefore we should implement it well." I'm not convinced that we should be allowing doctors to record patient visits at this stage yet, but I'm really not convinced by these points, which largely don't hold up under closer examination. A few that stuck out:

"Privacy" - Labs are routinely sent to third-party companies, and we don't do informed consent for that. The third-party argument isn't unique to recording.

"False promise of efficiency" - This doesn't really have anything to do with patients at all. It's a criticism of medical office management, not of physician-patient interactions. Telling patients to refuse a tool because management might exploit the productivity gains is asking patients to fight a labor battle on the provider's behalf.

"Consent can't be revoked mid-visit" - Consent typically can't be revoked in the middle of an appendectomy, or halfway through administering a vaccine either. Practical irrevocability is a normal feature of informed consent, not a special problem unique to recording. Proper consent processes in medical offices are a broader issue than consent about voice recordings specifically.

Had the authors made the point that providers are being asked to obtain consent for tools whose technical implementation and privacy risks fall outside the provider's own domain knowledge, that would be a stronger argument. But that isn't quite the point they made, and their current framing doesn't wholly convince.
kube-system
There might be some real concern about the cognitive and patient-interaction impacts of speech recognition being used... but on the other hand, it's more likely that details are missed when information is captured manually. And the privacy/informed consent concerns here are silly; they apply to any of your charted data... and if you're going to any office that doesn't use the latest technology, your patient information is probably being sent between offices over fax anyway.
burnte
I'm a healthcare CIO of 12 years, and I've evaluated 4 and deployed 2 of these tools, one of which is currently deployed at my current healthcare employer. I am very measured on AI, but the results I've seen from these virtual scribes are HUGE. In every case we have IMMEDIATELY seen improvements in patient NPS scores, provider satisfaction, and note quality. Notes are more standardized as well as more verbose and detailed, which makes it easier for future providers to understand the case. These better notes reduce our claim rejection rate. And what converted me was direct patient response. Across the board, patient feedback is extremely positive, with the most common comment being along the lines of "I really felt like the doctor connected with me better and they were more present in the visit." These AI scribes really DO improve patient care; I've seen it with my own eyes.
varispeed
I always agree if this is for academic purposes, if it helps with research etc. I can't see why I shouldn't. We are just meat that will expire one day.
xxpor
This was extremely unconvincing for me. The site is now 500ing for me, so I can't fully quote it, but the arguments about privacy just fall flat. You don't know about Epic's, or GE's, or Philips' security either. You have to trust the institution of HIPAA et al. overall to at least make things right. I really don't care if my recording becomes training data. I would rather be spoken to like I'm not an idiot. Use technical terms, please. I want precision. Calling the US healthcare system underfunded might be the most wild part of the whole thing. We spend 5.3 trillion dollars a year. That's 17% of the entire economy.
impatient_bacon
Oof, yeah, I just got surprised by this at a vet appointment for my dog; it weirded me out. I just went along with it to get the visit over with, and I can see the benefit of having an accurate record of the visit, but we'll have to come to terms with the reality of this invasive surveillance as a society at some point, I imagine.
dlcarrier
I'm more concerned about a record being made at all than about how it is made. If I were to be affected by a tragedy and visit a psychologist or psychiatrist to receive support, it would likely require a diagnosis of depression to get insurance coverage, and having that on my record could make it more difficult and costly to legally fly an airplane or own a gun, and who knows what else.
the_gipsy
I live in a country with free public healthcare. In a recent doctor's visit, the doctor was interviewing me while a nurse was typing into the computer, presumably so that the doctor would have more time to attend to patients and wouldn't get distracted. It's fascinating how in the USA the same idea that should mean "more time with patients" in reality also means "more patients", and is somehow considered bad because there is a monetary drive behind it.
cromka
This is seriously a good example of a domain that should enforce on-premises AI. Doctors absolutely can afford to buy an NVIDIA workstation. Transcribing text is not exactly super demanding, comparatively speaking. When did we even stop considering non-cloud services? If AI had boomed 10 years ago, we wouldn't even be discussing this.
daedrdev
HIPAA exists and has a lot of teeth. Given this extensive liability, I trust that if anything does go wrong, they will be punished. Recordings might dramatically improve patient outcomes, so I will let them record.
jimt1234
This situation is real. I've had the same doctor my entire adult life (~25 years). We've got a pretty informal relationship. I even saw her hammered at a bar one night, and had to give her a ride home because her friends were also drunk AF. Anyway, a few years ago, during an annual checkup, she asked how my family was doing and I made a joke about my brother drinking too much. A few weeks later I started receiving pamphlets in the mail about treating alcoholism, ads for rehab centers. I just brushed it off, didn't make any connection. Then, the next year, during my annual checkup, my doctor wasn't available, so I got a different doctor, someone I'd never talked to in my life. She immediately started asking me about my drinking. I fired back, asking WTF she was talking about. She said, "Oh, well your file says alcoholism runs in your family," and then started lecturing me about how getting over the shame of alcoholism is the first step to beating it. I don't even drink. No one in my family drinks other than my brother. He was drinking a lot at that time because his favorite NFL team (LA Rams) was doing really well, and he was celebrating a lot. And it was just a joke. The next year, during my annual checkup, I gave my doctor a load of crap, telling her to record nothing I say unless I explicitly tell her to. She tried to defend the system, but she agreed. I'm still upset that my "file" still mentions alcoholism.
OutOfHere
Why do we even need to consult doctors anymore? Just let the AI decide. Docs should be freed up for doing physical tests and interventions, or otherwise for providing more training data for the AI in cases where the AI isn't producing results, or when a second look is urgently needed in an emergency.
pavel_lishin
Down for me, but Internet Archive grabbed a copy: https://web.archive.org/web/20260424151739/https://buttondow...
jll29
Advice: Regardless of whether you opt in or out, you should only permit anyone to record you if you get a copy of the recording for your own records.