‘Why am I paying for their time?’: Doctor caught using ChatGPT has everyone arguing

bnew

‘Why am I paying for their time?’: Doctor caught using ChatGPT has everyone arguing

‘Genuinely I think you should lose your medical license for this.’


Rachel Kiley

Posted on May 9, 2025, 9:30 am CDT

Split screen with doctor looking at computer screen on left and tweet on right.

@mayankja1n/X

A doctor’s usage of ChatGPT during a patient visit has social media users debating the future of medical treatment.

“Took my dad in to the doctor cus he sliced his finger with a knife and the doctor was using ChatGPT,” Mayank Jain (@mayankja1n) shared on X in early May. “Based on the chat history, it’s for every patient.”

The pictures he took showed that the doctor had input basic information—“left index finger laceration with kitchen knife surgical referral for wound care and sutures”—and had ChatGPT spit out a response that included a History of Present Illness (HPI), medical history, physical exam results, and a final Assessment and Plan.

It’s unclear whether this information was all based on previous input from the doctor or whether ChatGPT was simply filling in the blanks.

The screen also showed a series of previous chats that appeared to mostly be related to diagnoses, with one Bible-related chat thrown in for good measure.



Should doctors use ChatGPT?


As everyone reckons with how AI may be integrated into our daily lives moving forward, the idea of it being utilized by medical professionals raises specific concerns.

Some have argued that AI has access to a wealth of knowledge far beyond what any single person could retain on their own. In theory, this could speed up diagnoses, catch potentially dangerous drug interactions, or assist with other critical care.


There’s also been the suggestion that doctors using ChatGPT and the like to cut down on administrative tasks, such as filling out charts, is ultimately a good thing, as it could allow them to spend more time actually seeing patients.

But even that sort of usage potentially raises HIPAA concerns in the United States.

“Physicians can opt out of having OpenAI use the information to train ChatGPT,” Christian Hetrick wrote in a 2023 article for USC. “But regardless of whether you’ve opted out, you’ve just violated HIPAA because the data has left the health system.”

This doesn’t apply in every scenario. Doctors consulting AI or keeping identifying patient info out of it could be within the bounds of what’s allowable. But as the world rapidly shifts to become more reliant on AI, it’s hard to predict how much data will be offered up to these third-party systems, let alone how secure it will be.



Social media reacts


As a cut finger doesn’t require deep medical knowledge, most people weren’t too bothered by this particular instance of AI usage, including Jain. However, even the possibility of having to deal with long waits, hefty insurance premiums, and additional payments just to have a doctor check with ChatGPT—something we could all do at home—sparked frustration.

[Screenshots of X posts from @IAmMikeRana, @mayankja1n, @realdocspeaks, @stevemur, @erixfilming, and @stateoofgracie]

Some folks have just given up on doing more than finding amusement in all of this chaos, and honestly, who can blame them?

[Screenshots of X posts from @alex_valaitis, @JeremyNguyenPhD, @inductionheads, and @DanielSamanez3]

This may not be the future we wanted, but maybe it’s the future we deserve.
 

bnew




1/37
@mayankja1n
Took my dad in to the doctor cus he sliced his finger with a knife and the doctor was using ChatGPT 😂

Based on the chat history, it’s for every patient.



[image]

[image]


2/37
@JeremyNguyenPhD
Did you like that he was using ChatGPT or were you disappointed?

How about your Dad?

Also: 4o?! Dude, if I'm going to a doctor, the least they could do is use o3



3/37
@mayankja1n
I just found it funny. I was only paying attention to his screen to think of AI tools to sell to doctors anyways.

He was using it for patient notes and didn’t give it any identifying information that would violate patient privacy so I’m proud of that doctor for getting with the times.



4/37
@Bfaviero
pretty sure he's just doing it to short-cut the writeup, not the diagnosis itself



5/37
@mayankja1n
Ya exactly. Not much to diagnose with a cut finger anyways.

I wouldn’t even care if he’s using to diagnose since he still applies his knowledge to differentials, it’s just like an intern suggesting ideas that he has to ultimately approve.

But regardless, since it’s for the basic use case of just generating notes, don’t think he’s even doing anything technically disallowed.



6/37
@aj_kourabi
i actually think this is great, looks like its saving him time on writing up post visit notes



7/37
@mayankja1n
Ya I agree. As long as there’s a human in the loop. Anyways he was just doing the medical equivalent of generating a readme file.

The only reason I was paying attention to his screen in the first place was to see his process to think of AI tools to make life easier 😂



8/37
@inductionheads




[image]


9/37
@kregenrek
Peanut M&M Carbs Calculation???



[image]


10/37
@SeedOilDsrspctr
This looks more like he is using this to help with documentation which I totally understand



11/37
@stevemur
Hope the doctor didn’t have one of these signs



[image]


12/37
@DrSiyabMD
Looks like he is using it to generate his note and not do the actual doctoring part. Excellent use of AI, saves a lot of time



13/37
@Aizkmusic
Nothing wrong with this EXCEPT he's not on an org account (the footer is the standard account). Org accounts (allegedly) don't train data on inputs.



14/37
@bnj
Looks like he’s using it for paperwork and not a diagnosis



15/37
@BasedDeptGnrl
This isn't concerning...



[image]


16/37
@andrewdsouza
At least use o3. Good God man. What did they teach you in medical school



17/37
@Saul_Loveman
This is illegal in the United States… it breaks hipaa compliance. Your doctors can just give data to @sama ? Wild



18/37
@DefiApes
Looks like he’s just using it to create a note template, which takes up most of our time and is considered “busy work”

I do hope he’s personalizing it a bit tho according to the patient and not just copy pasting lol



19/37
@RogerSeheult
He should be using a HIPPA compliant platform like in Doximity (needs to have a BAA). We are coming out with a MedCram video on this in the next day or so. In the meantime, here’s a blog on the topic.

Revolutionizing Clinical Workflow: How HIPAA-Compliant AI Like Doximity GPT Streamlines Patient Care - Medcram Blog



20/37
@michaelgrowth
Well this is hyper illegal (depending on country).

Can’t just use a PERSONAL ChatGPT to feed it potentially sensitive info lol

Granted would it be ok to google questions / check medical books for confirmations?

If not, why?



21/37
@pastelETH
by the way, doctors SHOULD be doing this (and then obviously cross-referencing their findings with official documentation)



22/37
@KadriJibraan
the vibe doctor



23/37
@BTCGandalf
I want my doctor to use chatGPT as well.



24/37
@KieranO
Next time request he uses o3 😂



25/37
@garybasin
This is fine? Faster than copy pasting a word doc



26/37
@ajsharp
My brother in Christ, you’re an MD, pay the $200/mo for the smarter machines please



27/37
@thanosthinking
we’re cooked



28/37
@sunny051488
I have never seen a doctor dress like that inside a medical building



29/37
@ruslanjabari
raw doggin chatgpt is crazy



30/37
@SalehOfTomorrow
Vibe diagnostics



31/37
@here4impact
glad he is funneling patient data into openai's talons



32/37
@thanosthinking
malpractice



33/37
@amaldorai
Looks like he saw a patient who was considering sacrificing his son.



34/37
@prmshra


[Quoted tweet]
By the way, it’s actually really smart he’s not using o3. This is just for notes. Why waste time and compute on o3 for something this basic?


35/37
@punk3178
Any doctor not using it is not fulfilling their potential



36/37
@irl_danB
this is great and should be encouraged if not expected



37/37
@RealSamRogers
Yo @grok translate this



[image]



To post tweets in this format, more info here: https://www.thecoli.com/threads/tips-and-tricks-for-posting-the-coli-megathread.984734/post-52211196
 

O.T.I.S.
Everyone uses ChatGPT to help with their job in 2025.
Exactly

Isn’t this what this is for?

This is no different from me googling the commands for a switch I’ve never used before.


I’m no fan of this “AI wave” or fad, but a licensed doctor who went to school and understands HOW to do what came out of ChatGPT >>> rando idiots on the internet.


And it sounds like it pulled up information on the father, which I have no problem with if that’s the case. It might be a specialized version for the medical field. Who knows… who cares at this point. Isn’t that what it’s built for… to assist?
 

bnew
Most likely using it to write reports.

Once you set a system prompt to respond with a specific output template, it can be a huge time saver.

edit:
Code:
You are an expert medical scribe and clinical documentation specialist. Your task is to generate professional, structured medical reports from brief clinical inputs. Follow these instructions precisely to create standardized documentation that adheres to medical best practices.

## Input Processing Instructions

- Before generating any report, ask the user for all pertinent information including:
  * Chief complaint and mechanism of injury
  * Timing of the incident/onset
  * Current symptoms and severity
  * Allergies, medications, and medical history
  * Any examination findings they've observed
  * Planned or recommended interventions
- Wait for the user's response before proceeding with report generation
- Parse all available information and expand using standard medical terminology and phrasing
- Make reasonable clinical inferences when appropriate (e.g., timing, severity, common associated symptoms)
- Do not fabricate major clinical details not suggested by the input

## Output Format Requirements

Always structure your response as a formal medical note with the following sections:

1. **History of Present Illness (HPI)**
   - Begin with "HPI." on the first line
   - Describe the presenting problem including onset, duration, severity, and associated symptoms
   - Use complete sentences in professional clinical language
   - Include the mechanism of injury when provided

2. **Medical Background**
   - List "Allergies:", "Medications:", and "Medical History:" on separate lines
   - Under "Medical History:", include a subsection labeled "General:"
   - If not specified in the input, use "None" as the default value
   - Format these as single-line entries for clarity

3. **Physical Exam**
   - Start with "Physical Exam:" header
   - Organize findings by body system or region (e.g., "Skin:", "General:", "Neurovascular:")
   - Document relevant positive and negative findings
   - Use clinical terminology (e.g., "No acute distress", "Sensation intact")

4. **Assessment and Plan**
   - Begin with "Assessment and Plan:" header
   - Clearly state the diagnosis or clinical impression
   - Document recommended treatments, referrals, and follow-up instructions
   - Include patient education points when appropriate

## Style Guidelines

- Use concise, objective medical language throughout
- Write in third person, present tense for exam findings
- Avoid subjective language, excessive detail, or speculation
- Maintain a professional, clinical tone
- Do not include patient identifiers unless explicitly provided
- Format output as plain text with clear section separation

## Example Output Template

HPI.
Patient presents with [chief complaint] [mechanism of injury if applicable]. The [injury/condition] occurred [timeframe]. Patient reports [symptoms] and [severity]. [Additional relevant history].

Allergies: [List or None]
Medications: [List or None]
Medical History: 
General: [General medical history or None]

Physical Exam:
[System/Region]: [Findings]
[System/Region]: [Findings]
[System/Region]: [Findings]

Assessment and Plan:
[Diagnosis/impression]. [Treatment plan]. [Referrals if applicable]. [Patient instructions]. [Follow-up recommendations].

## After Generating Report

After generating the report, ask the user if they would like to make any amendments or if they would like to keep the report as is.
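
If you want to wire a prompt like this into a script instead of pasting it into the ChatGPT UI, here’s a minimal sketch using the OpenAI Python SDK. The model name, the SYSTEM_PROMPT placeholder, and the sample input are my own assumptions, not anything from the doctor’s actual setup, and a personal API key without a BAA is still not a HIPAA-safe place for patient identifiers.

Code:
from openai import OpenAI

# Paste the full scribe prompt from above into this string.
SYSTEM_PROMPT = """You are an expert medical scribe and clinical documentation specialist. ..."""

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_note(clinical_input: str) -> str:
    """Send a brief clinical input and return the structured note text."""
    response = client.chat.completions.create(
        model="gpt-4o",   # placeholder; use whatever model you prefer
        temperature=0.2,  # keep output close to the template
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": clinical_input},
        ],
    )
    return response.choices[0].message.content

# Example with the same de-identified input described in the article.
print(generate_note(
    "left index finger laceration with kitchen knife, "
    "surgical referral for wound care and sutures"
))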
 

tuckgod
Exactly

Isn’t this what this is for?

This is no different from me googling the commands for a switch I’ve never used before.


I’m no fan of this “AI wave” or fad, but a licensed doctor who went to school and understands HOW to do what came out of ChatGPT >>> rando idiots on the internet.


And it sounds like it pulled up information on the father, which I have no problem with if that’s the case. It might be a specialized version for the medical field. Who knows… who cares at this point. Isn’t that what it’s built for… to assist?
People just need things to be upset about, and people they think are doing better than them are the most satisfying targets, brother.
 